zoni / obsidian-export

Rust library and CLI to export an Obsidian vault to regular Markdown


Recursion limit exceeded

cristianvasquez opened this issue · comments

Hello @zoni, thanks for this plugin! It's amazing.

My report:

When exporting the note:

Transcluding.md, with the contents:

```
The action of ![[Transcluding]]
```

I get:

```
Error: Failed to export '/home/cvasquez/obsidian/public-garden/Transcluding.md'

Caused by:
    Recursion limit exceeded

Location:
    /home/cvasquez/.cargo/registry/src/github.com-1ecc6299db9ec823/obsidian-export-0.2.0/src/main.rs:45:19
```

Hey there @cristianvasquez!

To make sure I understand correctly:

  1. You have a note Transcluding.md
  2. This note includes itself (it contains ![[Transcluding]])
  3. You run into the Recursion limit exceeded error when trying to export a vault with this note.

Is that correct? If so, this is by design. A note which includes itself would otherwise lead to infinite recursion (which would eventually cause the program to abort with a panic).

I'm not sure off-hand how Obsidian itself handles this (I think it embeds about 7 iframes and then just stops?) but I chose to make this an explicit error to make a note like this very obvious.

Would you prefer to see a different behavior here?

I don't know what the expected behavior should be, but this behavior forced me to change my notes; otherwise, I cannot export.

This happens if there is any 'closed circuit' in notes that expand.

By the way, why does it expand the notes?

Perhaps some sort of .gitignore for the export would allow ignoring notes.

Or a fail silently option?

> This happens if there is any 'closed circuit' in notes that expand.

By this, do you mean note A which transcludes note B, where note B in turn transcludes note A again?

> By the way, why does it expand the notes?

I'm afraid I don't understand this question, could you elaborate?

> Perhaps some sort of .gitignore for the export would allow ignoring notes.

This is already supported, see https://github.com/zoni/obsidian-export#ignoring-files 😄
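If I remember the README correctly, this works by placing a file called `.export-ignore` in the root of the vault, using gitignore-style patterns (the patterns below are just illustrative examples):

```gitignore
# .export-ignore -- gitignore-style patterns for files to skip on export
private/
drafts/*.md
Transcluding.md
```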

> Or a fail silently option?

I think I'd like to understand the use-case slightly better before adding functionality, but I wouldn't be opposed to implementing something like that.

Oh! The ignore option was what I was looking for :)

Regarding the question, I still don't get why a recursion limit was hit :) shouldn't each note be processed only once?

> shouldn't each note be processed only once?

The current implementation is somewhat naive in that regard.

This function is responsible for parsing an Obsidian Markdown file:

```rust
fn parse_obsidian_note<'a>(path: &Path, context: &Context) -> Result<MarkdownTree<'a>> {
```

If this encounters the ![[embed]] syntax, it calls embed_file:

obsidian-export/src/lib.rs, lines 301 to 303 in 7027290:

```rust
Event::Text(CowStr::Borrowed("![")) => {
    let mut elements = embed_file(&text, &context)?;
    tree.append(&mut elements);
```

embed_file, in turn, calls the parse_obsidian_note function from above again if it encounters another Markdown file:

```rust
Some("md") => parse_obsidian_note(&path, &context)?,
```

This is what allows a loop to be created.
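To make that loop concrete, here is a self-contained sketch (not the actual obsidian-export code; the note map, embed parsing, and depth limit are all made up for illustration) of how a parse step that recursively expands `![[...]]` embeds ends up tripping a depth check on any self-including note:

```rust
use std::collections::HashMap;

// Hypothetical depth limit, standing in for the real recursion check.
const RECURSION_LIMIT: u32 = 10;

// Minimal model of the parse/embed cycle: `parse_note` expands every
// `![[target]]` embed by calling itself on the target note, so a note
// that (directly or indirectly) embeds itself recurses until the depth
// check trips -- mirroring the "Recursion limit exceeded" error.
fn parse_note(name: &str, notes: &HashMap<&str, &str>, depth: u32) -> Result<String, String> {
    if depth > RECURSION_LIMIT {
        return Err("Recursion limit exceeded".to_string());
    }
    let body = notes
        .get(name)
        .ok_or_else(|| format!("missing note: {name}"))?;
    let mut rest: &str = *body;
    let mut out = String::new();
    while let Some(start) = rest.find("![[") {
        out.push_str(&rest[..start]);
        let after = &rest[start + 3..];
        let end = after.find("]]").ok_or("unterminated embed".to_string())?;
        let target = &after[..end];
        // Recurse into the embedded note, one level deeper.
        out.push_str(&parse_note(target, notes, depth + 1)?);
        rest = &after[end + 2..];
    }
    out.push_str(rest);
    Ok(out)
}

fn main() {
    let mut notes = HashMap::new();
    notes.insert("Transcluding", "The action of ![[Transcluding]]");
    notes.insert("A", "A embeds ![[B]]");
    notes.insert("B", "B embeds ![[A]]");
    notes.insert("Plain", "No embeds here.");

    // A note without embeds exports unchanged.
    assert_eq!(parse_note("Plain", &notes, 0).unwrap(), "No embeds here.");
    // Both a direct self-embed and an A <-> B cycle hit the limit.
    assert!(parse_note("Transcluding", &notes, 0).is_err());
    assert!(parse_note("A", &notes, 0).is_err());
    println!("ok");
}
```

This also shows why an A → B → A "closed circuit" fails the same way as a direct self-embed: the depth counter only ever increases along the expansion path.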

However, I'm already passing through a context struct to hold information about the note that is being processed:

```rust
#[derive(Debug, Clone)]
/// Context holds parser metadata for the file/note currently being parsed.
struct Context<'a> {
    file: PathBuf,
    vault_contents: &'a [PathBuf],
    frontmatter_strategy: FrontmatterStrategy,
    note_depth: u32,
}
```

Technically, I would be able to keep track of all files that we've processed up to the root note, which would allow implementing the exact behavior you mention (skip a note if it's already been processed). I may play around with that and implement this (possibly as an optional mode) somewhere over the next couple of weeks. Stay tuned 😄
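As a rough sketch of that idea (the `note_ancestors` field and `descend` helper are hypothetical, not part of the actual Context struct), the context could carry the chain of notes currently being expanded and refuse to descend into any note already on that chain:

```rust
use std::path::PathBuf;

// Hypothetical variant of Context: instead of only a depth counter, it
// tracks the full chain of notes from the root down to the current one.
#[derive(Debug, Clone)]
struct Context {
    file: PathBuf,
    note_ancestors: Vec<PathBuf>, // notes currently being expanded, root first
}

impl Context {
    fn root(file: PathBuf) -> Context {
        Context {
            note_ancestors: vec![file.clone()],
            file,
        }
    }

    /// Returns None if embedding `child` would close a cycle, so the
    /// caller can skip the embed instead of recursing forever.
    fn descend(&self, child: PathBuf) -> Option<Context> {
        if self.note_ancestors.contains(&child) {
            return None; // already on the path to the root: skip it
        }
        let mut ancestors = self.note_ancestors.clone();
        ancestors.push(child.clone());
        Some(Context {
            file: child,
            note_ancestors: ancestors,
        })
    }
}

fn main() {
    let root = Context::root(PathBuf::from("Transcluding.md"));
    // A direct self-embed is detected immediately...
    assert!(root.descend(PathBuf::from("Transcluding.md")).is_none());
    // ...while a fresh note descends normally, and the A -> B -> A cycle
    // is caught one level further down.
    let a = root.descend(PathBuf::from("A.md")).unwrap();
    let b = a.descend(PathBuf::from("B.md")).unwrap();
    assert!(b.descend(PathBuf::from("A.md")).is_none());
    println!("ok");
}
```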

Hey @cristianvasquez, happy new year. 😃

I've just published v0.5.0 which includes a new option, --no-recursive-embeds, to skip processing files more than once within the same note. I believe this will give you the behavior you were looking for.
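For anyone landing here later, usage looks something like this (the vault and destination paths are just examples):

```shell
# Export the vault, skipping any note that was already embedded
# higher up in the same expansion chain.
obsidian-export --no-recursive-embeds ~/obsidian/public-garden ~/export
```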

Happy new year to you too, @zoni! Thanks for the new option :)