martinvonz / jj

A Git-compatible VCS that is both simple and powerful

Home Page: https://martinvonz.github.io/jj/

FR: allow restoring repo without snapshot

0xdeafbeef opened this issue · comments

Is your feature request related to a problem? Please describe.
Your app accidentally created a billion files, and you want to restore the repository to a clean state.
If one of the files is larger than snapshot.max-new-file-size, you're in trouble.
You can either snapshot the file just to delete it, or remove it by hand; both are cumbersome.

 cd $(mktemp -d)
 jj git init .
 echo a > a
 jj describe -m 'init'
 jj new
 dd if=/dev/urandom of=rand_10m bs=1M count=10
 jj restore .
#Error: Failed to snapshot the working copy
#The file '/tmp/tmp.CQHWzRWxd6/rand_10m' is too large to be snapshotted: it is 10.0MiB; the maximum size allowed is ~1.0MiB.
#Hint: This is to prevent large files from being added on accident. You can fix this error by:
#  - Adding the file to `.gitignore`
#  - Run `jj config set --repo snapshot.max-new-file-size 10485760`
#    This will increase the maximum file size allowed for new files, in this repository only.
#  - Run `jj --config-toml 'snapshot.max-new-file-size=10485760' st`
#    This will increase the maximum file size allowed for new files, for this command only.
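For reference, here is a sketch of the "snapshot it to delete" workaround mentioned above, using the per-invocation override from the hint (the 100 MiB value is just an example chosen to be larger than the offending file):

 # Raise the limit for this single command so the snapshot succeeds;
 # restoring from the parent then drops rand_10m from the working copy.
 jj --config-toml 'snapshot.max-new-file-size=104857600' restore .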

Describe the solution you'd like
jj restore --force, which would skip snapshotting the repo. Maybe we could introduce a new jj reset subcommand instead.

Have you tried jj restore . --ignore-working-copy?

> Have you tried jj restore . --ignore-working-copy?

❯ jj restore . --ignore-working-copy
Nothing changed.

/tmp/tmp.fmt69QzBUM [🍐 kxrwyuon 2d9285c5 on <no branch>]
❯ ls
a  rand_10m

IIRC, there has been some discussion about demoting the max-new-file-size error to a warning, which would allow jj restore in that situation. I'm not sure if that's a good idea, because it could be a footgun.

> IIRC, there has been some discussion about demoting the max-new-file-size error to a warning, which would allow jj restore in that situation. I'm not sure if that's a good idea, because it could be a footgun.

Perhaps it would be feasible to actually remove newly created files when --ignore-working-copy is set?

> Perhaps it would be feasible to actually remove newly created files when --ignore-working-copy is set?

--ignore-working-copy is the option for not touching working-copy files, so removing files when it is set seems scarier than allowing jj restore in a dirty working copy.
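(For example, just as an illustration, the usual use of the flag is reading repository state without triggering a snapshot at all:

 jj log --ignore-working-copy   # inspect history without snapshotting or updating the working copy
)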

Maybe jj restore --untracked, which would get rid of all files that are not tracked or ignored?

Maybe we could even extend the error and hint to include suggestions for commands to (see the mock-up below):
a. increase the max file size
b. ignore the file
c. restore untracked files
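
Purely as an illustration of that idea (this is not existing jj output, and no command for removing untracked files exists yet), the extended hint could look something like:

#Hint: This is to prevent large files from being added on accident. You can fix this error by:
#  - Adding the file to `.gitignore`
#  - Running `jj config set --repo snapshot.max-new-file-size 10485760` to raise the limit
#  - Running a (hypothetical) command that removes untracked files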

> Maybe jj restore --untracked, which would get rid of all files that are not tracked or ignored?

Yeah, an hg purge/git clean-like command is also an option. #3154
(But we'll need to deal with the max-new-file-size error somehow, because jj purge --ignore-working-copy doesn't make sense.)

Another option might be some new global flag that's similar to --ignore-working-copy, but instead of leaving the working copy stale, it would update the working-copy state to the new target commit without touching the files in the working copy (by calling WorkingCopy::reset()).

I think jj purge is a good idea and the right solution to the "My app created too many files" problem. I regularly use git clean -xfd to purge working directories (e.g. to delete all the build artifacts from make so you can do a completely clean rebuild). However, this requires a lot of care to make sure you're not in the wrong state. (Normally we would be safe in these cases thanks to automatic snapshots, but of course snapshotting is exactly what's broken here.)
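As an aside on the "requires a lot of care" point (my own illustration, not from the thread), the usual pattern with git is to do a dry run before the destructive call:

 git clean -xdn   # dry run: list the untracked and ignored files that would be removed
 git clean -xdf   # actually remove them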

There's another problem that isn't spelled out directly in the OP. From a UX perspective, it's important to know what you're going to destroy first, but that currently isn't possible, because jj st is broken by the very thing you're trying to fix. jj st should not fail on a max-file-size snapshot error; actually, it should probably always try to succeed, even in the face of outright corruption. It's one of the most important tools for understanding the repository state, and breaking it because a tool made a file too large is not great.

Instead, it should report such files as ?? or something, to indicate that they are not part of the snapshot. So, something like the following:

$ jj st
Working copy changes:
A  foobar
M  barbaz
R  bazbaz
?? too-large.txt

So I think we need to fix both of these so that these cases don't hurt so badly.