carefree0910 / carefree-creator

AI magics meet Infinite draw board.

Home Page: https://creator.nolibox.com/guest

On which port does the local server start?

bropines opened this issue · comments

I run the program in Colab through a bore tunnel. The thing is that the local server cannot see the assigned port.

The local port is 8123 by default, and since you've already seen the {"detail": "Not Found"}, it should be OK to use the URL (bropines.online:36849) already 😉

Also, by running

cfcreator serve --help

you may find that you can specify the port via

cfcreator serve -p xxxx

But it doesn't give me the GUI. Or is it supposed to be like that?
Initially, I thought that the server runs on port 8123 on localhost and I just need to proxy it into the network through the tunnel. But for some reason that didn't happen.
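
Roughly the flow I had in mind (just a sketch, not my exact commands; bore.pub is the public relay from bore's docs, my own relay and port differ):

cfcreator serve -p 8123              # start the backend (8123 is also the default)
bore local 8123 --to bore.pub        # in another cell/terminal: forward that same port; bore prints the public host:port to use as Server Host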

I think I get it. We create a server in Colab and connect it here:
image

Either my tunnel is a piece of garbage, or it simply doesn't work with anything other than ngrok.

Either I'm an idiot or something is wrong. The normal version on your server outputs the result just fine, but running it on a local server or on Colab doesn't output anything. Just endless loading.

That looks weird, might be a bug... I'll look into it 😣

Hi, I found that if you use the Try these out! section, the generation process will get stuck; but if you press the Generate & Add button directly, you can get the results. This is definitely a bug and I'll fix it ASAP!

In general, is it planned to publish the frontend code so that people can just run a local server themselves, if they're able to? For example, the naifu packages or the automatic1111 shell simply run as a WebUI directly on the Colab server.

I would also like a button in the local server settings that checks the connection. I think many people would feel calmer knowing they haven't messed anything up.

Sorry for bringing you such a terrible experience; the bug comes from the WebUI part - I emit an event before I listen to it, so the first try gets messed up. 😣

Now that I've updated the WebUI, everything should be fine after you refresh it!

As for your advice: the whole project was designed to decouple the frontend and the backend, which is pretty different from typical Gradio-based projects (e.g. automatic1111). I designed it this way because I want to bring more flexibility: people can build their own frontend on top of the backend APIs, or build their own backend against the frontend's API interfaces.

And this project, carefree-creator, is the backend part. 😉 The frontend part is scheduled to be open-sourced, as mentioned here.

I know that Gradio-based projects also support exposing backend APIs, but I think those APIs are not as standard/flexible as I need. 😔

image
I must have broken something again.

Ah, looks weird... Maybe you misfilled the Server Host? The ngrok address will change every time you run the Colab.

To check whether the Server Host is filled in correctly, you can copy-paste it into your browser and see if it shows {"detail": "Not Found"}. 😉
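
For example, the same check from a terminal (replace the placeholder with whatever you filled into Server Host):

curl -s http://YOUR-SERVER-HOST/     # a reachable backend answers with {"detail":"Not Found"}, same as in the browser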

I understand that. And when I click the link, it says not found. But it doesn't work either with my tunnel or with ngrok. I can't run it right now, but I can give you a link to the notebook for your tests.

Turns out that it's an https issue, which can be observed if you open the console of your browser:

image

I'm not familiar with bore, but if you can get it to serve https, it should be fine!

Here I used LocalTunnel. I think I can make it work with a certificate.
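
For the record, the LocalTunnel side is just this (a sketch, assuming Node.js is available in the Colab runtime and the backend is on the default port 8123):

npx localtunnel --port 8123          # prints an https URL (https://xxxx.loca.lt) to paste into Server Host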

browser_Wz5JRUCoau.mp4

Presets aren't working :)

Oh, the ACG Paints application is actually an internal testing feature that should not appear here yet. 🤣 I'll hide them tomorrow.

Ah... OK.

Because this application is not fully 'local' at the moment, I'm using some third-party APIs which may cost me quite a few dollars. 🤣 Once I've localized those algorithms, I'll bring it back online! 😉

Okay. Here's another question: will there be VAE support for models? I just didn't see it in the settings.

You mean only loading your own VAE? Not for now, because that would be a little too fine-grained. 😣 But you can load your own (full) ckpt: https://github.com/carefree0910/carefree-creator#custom-checkpoints