RoboSats / robosats

A simple and private bitcoin exchange

Home Page: https://learn.robosats.com


lnproxy support is broken

Describe the bug
When specifying the Lightning payout info (step 2/4), it is not possible to use lnproxy due to errors.

To Reproduce
Steps to reproduce the behavior:

  1. Take an offer
  2. Pay HODL invoice
  3. Enable the advanced toggle
  4. Check the "Use Lnproxy" checkbox
  5. Select "Lnproxy Dev" from the dropdown
  6. Specify a proxy budget
  7. Paste your wrapped invoice
  8. Click "WRAP"

Expected behavior
I'd expect the trade to move to the next step.

Screenshots
[screenshot attached]

Desktop (please complete the following information):

  • OS: macOS
  • Browser: Tor
  • Version: 12.0.5

Additional context
It seems that there has been a change to the lnproxy backend. Specifically, the relay list on lnproxy.org here:

https://github.com/lnproxy/lnproxy-webui2/blob/main/index.html#L71

does not match what is being used by RoboSats:

https://github.com/RoboSats/robosats/blob/main/frontend/static/lnproxies.json

https://github.com/lnproxy/lnproxy-webui2/blob/7ff4c5d524c1b4d9eeb8167ff1f60314eb46017d/index.html#L71-L73
Ideally these would be extracted into a standalone JSON file that the RoboSats client can fetch directly. This way we only need to keep the list of available lnproxy servers up to date in one repo.

We could also shape our wrap fetch function to more closely resemble:
https://github.com/lnproxy/lnproxy-webui2/blob/7ff4c5d524c1b4d9eeb8167ff1f60314eb46017d/assets/main.js#L36-L70

In addition, the payment_hash check is easy to perform, as we already have the Bolt11 decoder in our component.
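
For illustration, a minimal sketch of that check; it assumes a decoder with the interface of the `bolt11` npm package, which may differ from the decoder the component actually bundles:

import { decode } from 'bolt11';

// Pull the payment_hash tag out of a Bolt11 payment request.
const getPaymentHash = (paymentRequest: string): string | undefined => {
  const tag = decode(paymentRequest).tags.find((t) => t.tagName === 'payment_hash');
  return tag?.data as string | undefined;
};

// A wrapped invoice is only acceptable if it commits to the same payment_hash
// as the original invoice, so the proxy cannot settle without paying us.
export const samePaymentHash = (original: string, wrapped: string): boolean => {
  const hash = getPaymentHash(original);
  return hash !== undefined && hash === getPaymentHash(wrapped);
};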

Assuming the relays are extracted into a standalone JSON file, what would be the best way to "fetch from the RoboSats client"? Would it literally be using fetch() i.e. using the apiClient in LightningPayout.tsx to directly fetch the relay list from something like https://raw.githubusercontent.com/lnproxy/lnproxy-webui2/assets/relays.json ?

Thank you @shyfire131 for pushing forward this issue!

Would it literally be using fetch() i.e. using the apiClient in LightningPayout.tsx to directly fetch the relay list from something like https://raw.githubusercontent.com/lnproxy/lnproxy-webui2/assets/relays.json ?

There are a few options. Certainly the one you propose is the most straightforward, but it has some cons: 1) a fetch to GitHub is not great privacy-wise, 2) it would require handling a small loading time, and 3) if GitHub is down (or the fetch() fails), the lnproxy functionality won't work.

Other options are:

  1. Modify the npm build and dev scripts to download relays.json just before building the frontend bundle and include it within, e.g.:
// On frontend/package.json
{
    "scripts": {
        ...
        "build": "curl -o static/lnproxies.json location-of-relays.json && webpack --mode production",
    },
    ...
}

This way, on every RoboSats release the latest relays are included and the client does not need to fetch anything.

  2. ... maintain our own relays.json. As you have seen, we have a more complex setup to automatically detect the bitcoin network (mainnet | testnet) and the endpoint network (tor | clearnet | i2p). The relays.json maintained at lnproxy.org only has mainnet & Tor endpoints, but these are not always useful. So unless lnproxy.org adopts an approach similar to ours for the specification of the relays file, we would need to maintain our own.

Hmm, true, good catch regarding privacy. It would go over Tor, right? But if someone is using the unsafe mirror without a VPN, that could be an issue. Anyway, for all those reasons I totally agree it's not the best.

So what do we do?

I am not a fan of maintaining the list of relays in multiple places; in practice we would not periodically check the lnproxy repo and would just forget about it. However, even if we went with option 1, we would still need a mapping to the internal structures, since there are currently 1 clearnet and 2 Tor relays, meaning there is not a 1:1 mapping to their relay list. I had punted on figuring this out, but now is the time.

What I suggest is the following:

  1. Download relays.json from their repo during npm build and dev, as you suggest.
  2. Replace the current direct import with an "import and map" step, using the following scheme (a rough code sketch follows the two examples below):

Given a relay list of the form

['host1.onion/spec', 'host2.onion/spec', 'host3.clear.net/spec', 'host4.clear.net/spec']

For Tor users, we map the above to this internal structure:

[
  {
    "name": "↬ Lnproxy Tor 1",
    "mainnetClearnet": "undefined",
    "mainnetTOR": "host1.onion/spec",
    "mainnetI2P": "undefined",
    "testnetClearnet": "undefined",
    "testnetTOR": "undefined",
    "testnetI2P": "undefined"
  },
  {
    "name": "↬ Lnproxy Tor 2",
    "mainnetClearnet": "undefined",
    "mainnetTOR": "host2.onion/spec",
    "mainnetI2P": "undefined",
    "testnetClearnet": "undefined",
    "testnetTOR": "undefined",
    "testnetI2P": "undefined"
  }
]

For clearnet users, we map it to this structure:

[
  {
    "name": "↬ Lnproxy Clearnet 1",
    "mainnetClearnet": "host3.clear.net/spec",
    "mainnetTOR": "undefined",
    "mainnetI2P": "undefined",
    "testnetClearnet": "undefined",
    "testnetTOR": "undefined",
    "testnetI2P": "undefined"
  },
  {
    "name": "↬ Lnproxy Clearnet 2",
    "mainnetClearnet": "undefined",
    "mainnetTOR": "host3.clear.net/spec",
    "mainnetI2P": "undefined",
    "testnetClearnet": "undefined",
    "testnetTOR": "undefined",
    "testnetI2P": "undefined"
  }
]
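
In code, that mapping step could look roughly like this (a sketch only; the flat relay-list input, the ".onion" detection, and the optional-field shape are assumptions on my side):

// Hypothetical sketch of the "import and map" step. LnproxyRelay mirrors the
// internal structure above; detecting Tor hosts by the ".onion" suffix and the
// flat relay-list input format are assumptions.
interface LnproxyRelay {
  name: string;
  mainnetClearnet?: string;
  mainnetTOR?: string;
  mainnetI2P?: string;
  testnetClearnet?: string;
  testnetTOR?: string;
  testnetI2P?: string;
}

export const mapRelays = (relays: string[], client: 'TOR' | 'Clearnet'): LnproxyRelay[] => {
  // Keep only the endpoints the client can actually reach.
  const reachable = relays.filter((url) =>
    client === 'TOR' ? url.includes('.onion') : !url.includes('.onion'),
  );
  return reachable.map((url, index) => ({
    name: `↬ Lnproxy ${client === 'TOR' ? 'Tor' : 'Clearnet'} ${index + 1}`,
    ...(client === 'TOR' ? { mainnetTOR: url } : { mainnetClearnet: url }),
  }));
};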

This still doesn't feel clean though, and I suspect a larger refactor of the internal structures is required, to rather use a JSON structure like this:

[
  {
    "name": string,
    "url": string,
    "type": Tor | Clearnet | I2P
    "network": mainnet | testnet
  }
]
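
In TypeScript terms, something along these lines (names are illustrative):

// Rough TypeScript equivalent of the proposed structure: one entry per lnproxy
// access point, so a single relay operator may appear several times.
type RelayType = 'TOR' | 'Clearnet' | 'I2P';
type BitcoinNetwork = 'mainnet' | 'testnet';

interface LnproxyEndpoint {
  name: string;
  url: string;
  type: RelayType;
  network: BitcoinNetwork;
}

// Picking the endpoints a client can use then becomes a plain filter.
const usableEndpoints = (
  endpoints: LnproxyEndpoint[],
  type: RelayType,
  network: BitcoinNetwork,
): LnproxyEndpoint[] => endpoints.filter((e) => e.type === type && e.network === network);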

Yes! This is better. Initially I thought each Lnproxy service would be providing several of these access points; however, it looks like that won't be the case.

If we want to get fancy and have the best of both worlds (i.e., maintain our own lnproxies.json so the RoboSats codebase is self-sufficient, yet pull new additions/deletions from the lnproxy repository), we could write a small GitHub workflow automation to look for changes in /lnproxy/lnproxy-webui2/assets/relays.json weekly and, if there are changes, edit and commit a new lnproxies.json to this repository (sticking to the format that works for RoboSats). It's overkill, but hey, sounds like fun to code! I'm sure GPT-4 can one-shot this GitHub workflow.
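
The workflow itself would just be a weekly cron trigger plus a commit step; the interesting part is the sync script it runs, which could be a small Node script along these lines (upstream URL, target path, and Node 18+'s global fetch are assumptions; it also skips the mapping into our own format for brevity):

// scripts/syncLnproxies.ts — hypothetical sync step a scheduled workflow could
// run: fetch the upstream relays.json and rewrite our local lnproxies.json only
// when it changed. A real version would first map it into the RoboSats format.
import { existsSync, readFileSync, writeFileSync } from 'fs';

const UPSTREAM =
  'https://raw.githubusercontent.com/lnproxy/lnproxy-webui2/main/assets/relays.json';
const TARGET = 'frontend/static/lnproxies.json';

const main = async (): Promise<void> => {
  const response = await fetch(UPSTREAM);
  if (!response.ok) {
    throw new Error(`Upstream fetch failed with status ${response.status}`);
  }
  const upstream = JSON.stringify(await response.json(), null, 2);
  const current = existsSync(TARGET) ? readFileSync(TARGET, 'utf8') : '';
  if (upstream !== current) {
    writeFileSync(TARGET, upstream);
    console.log('lnproxies.json updated; the workflow can now commit the change.');
  } else {
    console.log('No upstream changes.');
  }
};

void main();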

Sounds good.

Hah, I also noticed the underlying API changed entirely. It now uses POST instead of GET:

https://github.com/lnproxy/lnproxy-webui2/blob/7ff4c5d524c1b4d9eeb8167ff1f60314eb46017d/assets/main.js#L65
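
A rough sketch of what the POST-based wrap call might look like; the request/response field names (invoice, routing_msat, proxy_invoice, reason) and the assumption that the relay URL already includes the /spec path should be verified against the linked main.js:

// Hypothetical POST-based wrap request; field names are assumptions.
interface WrapResponse {
  proxy_invoice?: string;
  reason?: string;
}

export const wrapInvoice = async (
  relayUrl: string,
  invoice: string,
  routingBudgetMsat: number,
): Promise<string> => {
  const response = await fetch(relayUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ invoice, routing_msat: String(routingBudgetMsat) }),
  });
  const data: WrapResponse = await response.json();
  if (!response.ok || data.proxy_invoice === undefined) {
    throw new Error(data.reason ?? 'lnproxy wrap failed');
  }
  return data.proxy_invoice;
};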