chrisdiana / cms.js

Client-Side JavaScript Site Generator

Home Page: http://chrisdiana.github.io/cms.js

Allow publishing new content

piranna opened this issue

It's possible to use https://github.com/azu/git-commit-push-via-github-api to create commits on GitHub using its API. Could support be added for creating new blog entries directly from the browser, by adding new files with their content?

Hey @piranna, I think you would need an API key in order to push new commits to the repo, so it most likely couldn't be done standalone within the config.js file. If the key lived in that file, it would be really dangerous: since the site is all public-facing, your secret API key would be visible to anyone. But...

There could be a way to do it in-browser through CMS.js, although I'm not sure how ideal it would be. In theory, we could either have you input your secret key each time you want to push a new commit, or store that key locally in browser storage so only the author has access to those GitHub actions. This would take a bit of work to incorporate into CMS.js, but it's definitely a cool concept.
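To make the idea concrete, here is a minimal sketch of what in-browser publishing could look like, using GitHub's Contents API (`PUT /repos/{owner}/{repo}/contents/{path}`) with the token read from browser storage as suggested above. The function names and the storage key are hypothetical, not part of CMS.js:

```javascript
// Sketch: create a new blog post via GitHub's Contents API.
// The token is read from localStorage as discussed above; all names
// here are illustrative, not part of CMS.js.

// Build the request for PUT /repos/{owner}/{repo}/contents/{path}.
// Kept pure so it can be inspected/tested without touching the network.
function buildCreateFileRequest({ owner, repo, path, markdown, token, message }) {
  return {
    url: `https://api.github.com/repos/${owner}/${repo}/contents/${path}`,
    options: {
      method: 'PUT',
      headers: {
        Accept: 'application/vnd.github+json',
        Authorization: `token ${token}`,
      },
      body: JSON.stringify({
        message,
        content: btoa(markdown), // the Contents API expects base64 content
      }),
    },
  };
}

// Browser-side usage: read the token the author stored earlier.
async function publishPost(owner, repo, path, markdown) {
  const token = localStorage.getItem('cms-gh-token'); // hypothetical key
  const { url, options } = buildCreateFileRequest({
    owner, repo, path, markdown, token,
    message: `Add post ${path}`,
  });
  const res = await fetch(url, options);
  if (!res.ok) throw new Error(`GitHub API error: ${res.status}`);
  return res.json();
}
```

Storing the token in localStorage keeps it off the public site, but anyone with access to that browser profile could read it, so this trades convenience against local security.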

In the meantime, you can actually edit your content and commit in-browser by going to your repo's files and clicking the "Edit this file" button near the top. The only limitation I see right now is adding new files.

[Screenshot: GitHub's in-browser "Edit this file" view, 2018-12-09]

At https://github.com/NodeOS/GitBlog we were using GitHub authentication to bypass the request limit, but it would also be possible to allow only the blog owner (or other authorized users) to publish new content :-) The owner would just need to log in to GitHub, and afterwards OAuth2 would let them post new entries and upload new files from a CMS/blog-like interface instead of dealing directly with the site files. It's just like what's currently being done to fetch the data files, but authenticated, and also allowing modifying them and publishing new ones.
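The login described here would start with GitHub's standard OAuth2 web flow. Below is a hedged sketch of just the first step (building the authorize URL); the client ID and redirect URI are placeholders, and note that exchanging the returned code for a token requires the client secret, so that second step needs a small server-side proxy and cannot be done purely in the browser:

```javascript
// Sketch of step one of GitHub's OAuth2 web application flow.
// clientId/redirectUri are placeholders; the code-for-token exchange
// that follows requires the client secret, hence a server-side helper.
function githubAuthorizeUrl({ clientId, redirectUri, state }) {
  const params = new URLSearchParams({
    client_id: clientId,
    redirect_uri: redirectUri,
    scope: 'public_repo', // enough to commit to a public blog repo
    state,                // anti-CSRF value, verified when GitHub redirects back
  });
  return `https://github.com/login/oauth/authorize?${params}`;
}

// Usage: send the blog owner to GitHub, then read ?code=...&state=...
// from the redirect back to the site.
// window.location.href = githubAuthorizeUrl({
//   clientId: 'YOUR_CLIENT_ID',
//   redirectUri: 'https://example.github.io/',
//   state: 'random-value',
// });
```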

Very cool @piranna. Conceptually, the GitBlog project seems really similar to this project! I wish I had stumbled across it a few years ago when I started this.

The API limit is definitely something I've been trying to figure out ways to bypass, and I think this could be a possible alternative for authenticated users. Also, the blog-like interface is something I've heard users would love to have as an option. We'll still have to figure out how to bypass request limits for unauthenticated users, as a user's blog might start to grow, but hopefully a solution will arise soon.

That being said, I'm definitely on-board with adding the blog-like editing interface to the roadmap...especially if we incorporate authentication into the lib.

I'm gonna jump into your project and explore it a little bit.

Very cool @piranna. Conceptually, the GitBlog project seems really similar to this project! I wish I had stumbled across it a few years ago when I started this.

I'm glad you liked it :-) In fact, the idea behind it was to use GitHub issues as blogging storage, so it could also work as a forum and mailing list... Not exactly the same, but I can see the similarities :-) But it's cleaner to have everything in a plain old git repo, which also allows moving it anywhere easily; that's why I'd prefer to move to cms.js :-)

We'll still have to figure out how to bypass request limits for unauthenticated users as a user's blog might start to grow but hopefully a solution will arise soon

I don't know if this would be feasible... but limiting it to only authenticated users would be a good and easy first move. Maybe limiting the number of requests to the minimum needed? Using caching on the client side (localStorage, IndexedDB...)? I know this can be fairly easy with redux-offline...
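The client-side caching idea can be sketched in a few lines: keep each response in localStorage with a timestamp and reuse it until it expires, so repeat visits cost zero API requests. Storage and fetch are injected here so the sketch is testable outside a browser; the key prefix and TTL are arbitrary choices, not CMS.js behavior:

```javascript
// Sketch of TTL-based client-side caching for API responses.
// In the browser you would pass { storage: localStorage, fetchFn: fetch }.
async function cachedFetchJson(url, { storage, fetchFn, ttlMs = 5 * 60 * 1000, now = Date.now }) {
  const key = `cms-cache:${url}`; // hypothetical key scheme
  const hit = storage.getItem(key);
  if (hit) {
    const { savedAt, data } = JSON.parse(hit);
    if (now() - savedAt < ttlMs) return data; // fresh: no network request at all
  }
  const res = await fetchFn(url);
  const data = await res.json();
  storage.setItem(key, JSON.stringify({ savedAt: now(), data }));
  return data;
}
```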

That being said, I'm definitely on-board with adding the blog-like editing interface to the roadmap...especially if we incorporate authentication into the lib.

If you need help, don't hesitate to ask me :-)

The unauthenticated rate limit is 60 requests per hour per user, and the authenticated one is 5000 requests per hour per user. This can be increased by using a GitHub App token, but that limits it to 5000 requests per hour per app (since we'd be authenticated, while still being ourselves the users doing the requests), and doing it client-side is insecure. Alternatively, it's possible to use conditional requests so we can cache responses client-side, as I suggested before. Finally, another option would be the GraphQL API, which has different rates, but I don't know if it's possible to use it anonymously. I would go with caching + conditional requests; I'm not sure if there's already a library for that on npm...
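The conditional-request idea builds on ETags: store each response together with its `ETag`, send `If-None-Match` on the next request, and reuse the cached body when GitHub answers `304 Not Modified`. A 304 still costs a round trip, but GitHub does not count 304 responses against the rate limit. A minimal sketch, with storage and fetch injected so it is testable (names are illustrative):

```javascript
// Sketch of ETag-based conditional requests against the GitHub API.
// In the browser: conditionalFetchJson(url, { storage: localStorage, fetchFn: fetch }).
async function conditionalFetchJson(url, { storage, fetchFn }) {
  const key = `cms-etag:${url}`; // hypothetical key scheme
  const cached = storage.getItem(key);

  const headers = {};
  if (cached) headers['If-None-Match'] = JSON.parse(cached).etag;

  const res = await fetchFn(url, { headers });
  if (res.status === 304) return JSON.parse(cached).data; // cache still valid, no quota used

  const data = await res.json();
  const etag = res.headers.get('ETag');
  if (etag) storage.setItem(key, JSON.stringify({ etag, data }));
  return data;
}
```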

Awesome @piranna thanks for digging into the API rate limits. That’s definitely helpful info.

I like the idea of using localStorage as a cache, and maybe conditional requests (if they don't break other functionality). I think that could be a good solution for now to help limit the number of API requests needed. Personally, I'd prefer to stay away from any dependencies (if possible, besides devDependencies) to keep the library lightweight.

I’m in the process of setting up the feature roadmap here so I’ll let you know when that’s ready and we can add some of these items to it.

Also, thanks for offering up your help! It would be great to get you on board to help out with some of this stuff if you are interested.

Awesome @piranna thanks for digging into the API rate limits.

You are welcome :-)

Personally, I'd prefer to stay away from any dependencies (if possible, besides devDependencies) to keep the library lightweight.

We could keep an eye on the library size with size-limit, to know exactly when things are going too far. If we make use of caching and conditional requests, it makes sense to use already-available modules, or at least to split the functionality to allow reuse. For example, access to files can be done with regular HTTP requests, limiting use of the GitHub API to the bare minimum. That would decrease API usage and allow more requests, but it would make the code more complex, since there would be two different ways to access GitHub; it makes sense to wrap them and offer a single API to the application, no matter where the data came from.
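The split described here could look something like the sketch below: plain file contents come from raw.githubusercontent.com (which doesn't consume REST API quota), and only directory listings go through the rate-limited API, with both hidden behind one object. All names are hypothetical:

```javascript
// Sketch of a single wrapper over two ways of reaching GitHub.
// Pure URL builders, kept separate so they are easy to test.
function rawUrl(owner, repo, branch, path) {
  return `https://raw.githubusercontent.com/${owner}/${repo}/${branch}/${path}`;
}
function apiUrl(owner, repo, branch, path) {
  return `https://api.github.com/repos/${owner}/${repo}/contents/${path}?ref=${branch}`;
}

// One API for the application, regardless of where the data came from.
function githubSource({ owner, repo, branch = 'master' }) {
  return {
    // Raw file fetch: does not count against the REST API rate limit.
    async readFile(path) {
      const res = await fetch(rawUrl(owner, repo, branch, path));
      if (!res.ok) throw new Error(`raw fetch failed: ${res.status}`);
      return res.text();
    },
    // Directory listing: has to go through the rate-limited REST API.
    async listDir(path) {
      const res = await fetch(apiUrl(owner, repo, branch, path));
      if (!res.ok) throw new Error(`API fetch failed: ${res.status}`);
      return (await res.json()).map(entry => entry.name);
    },
  };
}
```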

Also, thanks for offering up your help! It would be great to get you on board to help out with some of this stuff if you are interested.

You are welcome :-)

I've found a problem related to comments: since they could be written by anybody, commenters would need commit permissions on the repo, which mostly won't be the case. Alternatively, they could be set as the commit authors, with ourselves as the committer; since we have permissions that would work, but then we would need to expose our token publicly, which is a security issue. For publishing blog entries there's no problem at all, because as blog owners we have repo permissions, but comments would need to use Disqus or Twitter or something else. Alternatively, in the same way, it would be possible to craft a server that receives the comment and publishes the commit with it, so the token stays secure...
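The author/committer split mentioned here maps directly onto the Contents API, which accepts separate `author` and `committer` objects on a commit. A hedged sketch of the payload a small trusted server (holding the owner's token) might build, with all paths and names illustrative:

```javascript
// Sketch: record the commenter as the commit's author while the blog
// owner (whose token only the trusted server holds) is the committer.
// Payload shape follows GitHub's Contents API; names are illustrative.
function buildCommentCommit({ path, comment, commenter, ownerToken }) {
  return {
    path: `comments/${path}`, // hypothetical comments directory
    options: {
      method: 'PUT',
      headers: {
        Accept: 'application/vnd.github+json',
        Authorization: `token ${ownerToken}`,
      },
      body: JSON.stringify({
        message: `Comment from ${commenter.name}`,
        content: btoa(comment), // Contents API expects base64
        author: { name: commenter.name, email: commenter.email },
        // committer defaults to the authenticated user (the blog owner)
      }),
    },
  };
}
```

Because the owner's token never reaches the browser, commenters can be attributed correctly without the security issue described above.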

I think noddity may be of interest: it uses pull requests to manage content changes in a wiki-like style. Also, at http://jlord.us/forkngo/ there are several tools for static websites; the most interesting is http://jlord.us/sheetsee.js/, which uses Google Spreadsheets as a database and could be used for the comments :-)

Ok, https://staticman.net/ automatically modifies data files in Jekyll static sites; that's exactly what we were needing :-)