Backup fails when using GitHub Actions
KengChiChang opened this issue
Keng-Chi Chang commented
Describe the bug
Backup fails with the following error when run via GitHub Actions:
2020-11-12T23:08:48.5285336Z ##[group]Run roam-to-git --skip-git .
2020-11-12T23:08:48.5285894Z roam-to-git --skip-git .
2020-11-12T23:08:48.5326962Z shell: /bin/bash -e {0}
2020-11-12T23:08:48.5327274Z env:
2020-11-12T23:08:48.5327742Z pythonLocation: /opt/hostedtoolcache/Python/3.8.6/x64
2020-11-12T23:08:48.5328820Z ROAMRESEARCH_USER: ***
2020-11-12T23:08:48.5329315Z ROAMRESEARCH_PASSWORD: ***
2020-11-12T23:08:48.5329815Z ROAMRESEARCH_DATABASE: ***
2020-11-12T23:08:48.5330195Z ##[endgroup]
2020-11-12T23:08:48.9340664Z 2020-11-12 23:08:48.933 | DEBUG | roam_to_git.__main__:main:53 - No secret found at /home/runner/work/notes-roam/notes-roam/.env
2020-11-12T23:08:48.9353107Z 2020-11-12 23:08:48.935 | DEBUG | roam_to_git.scrapping:download_rr_archive:55 - Creating browser
2020-11-12T23:08:48.9449641Z [W:pyppeteer.chromium_downloader] start chromium download.
2020-11-12T23:08:48.9450680Z Download may take a few minutes.
2020-11-12T23:08:49.0969896Z
2020-11-12T23:08:49.1986079Z 0%| | 0/108773488 [00:00<?, ?it/s]
2020-11-12T23:08:49.9221485Z 100%|██████████| 108773488/108773488 [00:00<00:00, 131896082.17it/s]
2020-11-12T23:08:49.9222334Z [W:pyppeteer.chromium_downloader]
2020-11-12T23:08:49.9223144Z chromium download done.
2020-11-12T23:08:52.2892863Z [W:pyppeteer.chromium_downloader] chromium extracted to: /home/runner/.local/share/pyppeteer/local-chromium/588429
2020-11-12T23:08:52.7936704Z 2020-11-12 23:08:52.790 | DEBUG | roam_to_git.scrapping:download_rr_archive:55 - Creating browser
2020-11-12T23:08:53.3440811Z 2020-11-12 23:08:53.343 | DEBUG | roam_to_git.scrapping:_download_rr_archive:92 - Configure downloads to /tmp/tmpi9jnfg0t
2020-11-12T23:08:53.3557366Z 2020-11-12 23:08:53.355 | DEBUG | roam_to_git.scrapping:_download_rr_archive:92 - Configure downloads to /tmp/tmpud7xngpv
2020-11-12T23:08:53.3790443Z 2020-11-12 23:08:53.378 | DEBUG | roam_to_git.scrapping:signin:185 - Opening signin page
2020-11-12T23:08:53.3920095Z 2020-11-12 23:08:53.391 | DEBUG | roam_to_git.scrapping:signin:185 - Opening signin page
2020-11-12T23:08:58.4591233Z 2020-11-12 23:08:58.458 | DEBUG | roam_to_git.scrapping:signin:189 - Fill email '***'
2020-11-12T23:08:58.5449393Z 2020-11-12 23:08:58.544 | DEBUG | roam_to_git.scrapping:signin:189 - Fill email '***'
2020-11-12T23:09:02.4789398Z 2020-11-12 23:09:02.478 | DEBUG | roam_to_git.scrapping:signin:194 - Fill password
2020-11-12T23:09:02.7633180Z 2020-11-12 23:09:02.762 | DEBUG | roam_to_git.scrapping:signin:194 - Fill password
2020-11-12T23:09:03.2010520Z 2020-11-12 23:09:03.200 | DEBUG | roam_to_git.scrapping:signin:199 - Click on sign-in
2020-11-12T23:09:03.4832219Z 2020-11-12 23:09:03.482 | DEBUG | roam_to_git.scrapping:signin:199 - Click on sign-in
2020-11-12T23:09:05.4378248Z 2020-11-12 23:09:05.437 | DEBUG | roam_to_git.scrapping:go_to_database:209 - Load database from url 'https://roamresearch.com/#/app/***'
2020-11-12T23:09:05.7155422Z 2020-11-12 23:09:05.714 | DEBUG | roam_to_git.scrapping:go_to_database:209 - Load database from url 'https://roamresearch.com/#/app/***'
2020-11-12T23:09:05.7300571Z 2020-11-12 23:09:05.729 | DEBUG | roam_to_git.scrapping:_download_rr_archive:102 - Wait for interface to load
2020-11-12T23:09:05.9650796Z 2020-11-12 23:09:05.964 | DEBUG | roam_to_git.scrapping:_download_rr_archive:102 - Wait for interface to load
2020-11-12T23:09:08.2471949Z 2020-11-12 23:09:08.246 | DEBUG | roam_to_git.scrapping:_download_rr_archive:130 - Launch download popup
2020-11-12T23:09:08.4014510Z 2020-11-12 23:09:08.400 | DEBUG | roam_to_git.scrapping:download_rr_archive:76 - Closing browser markdown
2020-11-12T23:09:08.4189149Z 2020-11-12 23:09:08.417 | DEBUG | roam_to_git.scrapping:_download_rr_archive:130 - Launch download popup
2020-11-12T23:09:08.4717014Z 2020-11-12 23:09:08.470 | DEBUG | roam_to_git.scrapping:download_rr_archive:78 - Closed browser markdown
2020-11-12T23:09:08.4799471Z 2020-11-12 23:09:08.472 | ERROR | __main__:<module>:33 - An error has been caught in function '<module>', process 'MainProcess' (2868), thread 'MainThread' (140282056095552):
2020-11-12T23:09:08.4800575Z Traceback (most recent call last):
2020-11-12T23:09:08.4801508Z > File "/opt/hostedtoolcache/Python/3.8.6/x64/bin/roam-to-git", line 33, in <module>
2020-11-12T23:09:08.4802536Z sys.exit(load_entry_point('roam-to-git==0.1', 'console_scripts', 'roam-to-git')())
2020-11-12T23:09:08.4803535Z │ │ └ <function importlib_load_entry_point at 0x7f95f4af34c0>
2020-11-12T23:09:08.4804256Z │ └ <built-in function exit>
2020-11-12T23:09:08.4804834Z └ <module 'sys' (built-in)>
2020-11-12T23:09:08.4806607Z File "/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/roam_to_git/__main__.py", line 76, in main
2020-11-12T23:09:08.4807945Z scrap(markdown_zip_path, json_zip_path, config)
2020-11-12T23:09:08.4809039Z │ │ │ └ <roam_to_git.scrapping.Config object at 0x7f95f03189d0>
2020-11-12T23:09:08.4809887Z │ │ └ PosixPath('/tmp/tmpud7xngpv')
2020-11-12T23:09:08.4810612Z │ └ PosixPath('/tmp/tmpi9jnfg0t')
2020-11-12T23:09:08.4811293Z └ <function scrap at 0x7f95eeac8a60>
2020-11-12T23:09:08.4812392Z File "/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/roam_to_git/scrapping.py", line 254, in scrap
2020-11-12T23:09:08.4813339Z asyncio.get_event_loop().run_until_complete(asyncio.gather(*tasks))
2020-11-12T23:09:08.4814572Z │ │ │ │ └ [<coroutine object download_rr_archive at 0x7f95eee2ff40>, <coroutine object download_rr_archive at 0x7f95ef325ec0>]
2020-11-12T23:09:08.4815587Z │ │ │ └ <function gather at 0x7f95f0f2e4c0>
2020-11-12T23:09:08.4816893Z │ │ └ <module 'asyncio' from '/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/asyncio/__init__.py'>
2020-11-12T23:09:08.4817803Z │ └ <built-in function get_event_loop>
2020-11-12T23:09:08.4818731Z └ <module 'asyncio' from '/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/asyncio/__init__.py'>
2020-11-12T23:09:08.4819736Z File "/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
2020-11-12T23:09:08.4820487Z return future.result()
2020-11-12T23:09:08.4821233Z │ └ <method 'result' of '_asyncio.Future' objects>
2020-11-12T23:09:08.4822273Z └ <_GatheringFuture finished exception=ValueError('not enough values to unpack (expected 1, got 0)')>
2020-11-12T23:09:08.4823725Z File "/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/roam_to_git/scrapping.py", line 69, in download_rr_archive
2020-11-12T23:09:08.4824771Z return await _download_rr_archive(document, output_type, output_directory, config)
2020-11-12T23:09:08.4825865Z │ │ │ │ └ <roam_to_git.scrapping.Config object at 0x7f95f03189d0>
2020-11-12T23:09:08.4826743Z │ │ │ └ PosixPath('/tmp/tmpi9jnfg0t')
2020-11-12T23:09:08.4827396Z │ │ └ 'markdown'
2020-11-12T23:09:08.4828227Z │ └ <pyppeteer.page.Page object at 0x7f95ee388160>
2020-11-12T23:09:08.4829301Z └ <function _download_rr_archive at 0x7f95eeac8820>
2020-11-12T23:09:08.4836490Z File "/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/roam_to_git/scrapping.py", line 132, in _download_rr_archive
2020-11-12T23:09:08.4837900Z export_all, = [b for b in divs_pb3 if await get_text(document, b) == 'export all']
2020-11-12T23:09:08.4838875Z │ │ └ <pyppeteer.page.Page object at 0x7f95ee388160>
2020-11-12T23:09:08.4839753Z │ └ <function get_text at 0x7f95f03193a0>
2020-11-12T23:09:08.4840730Z └ [<pyppeteer.element_handle.ElementHandle object at 0x7f95ed6be430>]
2020-11-12T23:09:08.4841298Z
2020-11-12T23:09:08.4841763Z ValueError: not enough values to unpack (expected 1, got 0)
2020-11-12T23:09:08.4842416Z Traceback (most recent call last):
2020-11-12T23:09:08.4843487Z File "/opt/hostedtoolcache/Python/3.8.6/x64/bin/roam-to-git", line 33, in <module>
2020-11-12T23:09:08.4844507Z sys.exit(load_entry_point('roam-to-git==0.1', 'console_scripts', 'roam-to-git')())
2020-11-12T23:09:08.4845772Z File "/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/loguru/_logger.py", line 1149, in catch_wrapper
2020-11-12T23:09:08.4846559Z return function(*args, **kwargs)
2020-11-12T23:09:08.4847567Z File "/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/roam_to_git/__main__.py", line 76, in main
2020-11-12T23:09:08.4848387Z scrap(markdown_zip_path, json_zip_path, config)
2020-11-12T23:09:08.4849484Z File "/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/roam_to_git/scrapping.py", line 254, in scrap
2020-11-12T23:09:08.4850452Z asyncio.get_event_loop().run_until_complete(asyncio.gather(*tasks))
2020-11-12T23:09:08.4851425Z File "/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
2020-11-12T23:09:08.4852170Z return future.result()
2020-11-12T23:09:08.4853294Z File "/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/roam_to_git/scrapping.py", line 69, in download_rr_archive
2020-11-12T23:09:08.4854305Z return await _download_rr_archive(document, output_type, output_directory, config)
2020-11-12T23:09:08.4855631Z File "/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/roam_to_git/scrapping.py", line 132, in _download_rr_archive
2020-11-12T23:09:08.4860395Z export_all, = [b for b in divs_pb3 if await get_text(document, b) == 'export all']
2020-11-12T23:09:08.4861144Z ValueError: not enough values to unpack (expected 1, got 0)
2020-11-12T23:09:08.4864002Z 2020-11-12 23:09:08.485 | DEBUG | roam_to_git.scrapping:_kill_child_process:217 - Terminate child process [psutil.Process(pid=2907, name='chrome', status='sleeping', started='23:08:52'), psutil.Process(pid=2918, name='chrome', status='sleeping', started='23:08:52'), psutil.Process(pid=2937, name='chrome', status='sleeping', started='23:08:52'), psutil.Process(pid=2948, name='chrome', status='sleeping', started='23:08:52'), psutil.Process(pid=2920, name='chrome', status='sleeping', started='23:08:52'), psutil.Process(pid=2936, name='chrome', status='running', started='23:08:52')]
2020-11-12T23:09:08.5773166Z ##[error]Process completed with exit code 1.
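For context, the `ValueError` comes from the single-target unpack on `scrapping.py` line 132: the comprehension filters the page's divs for one whose text is `'export all'`, and if that button has not rendered yet, the list is empty and the unpack fails. A minimal reproduction of just that failure mode (not roam-to-git's actual code — `divs_pb3` here is a plain list standing in for the pyppeteer element handles):

```python
# Minimal reproduction of the failure mode, not roam-to-git's actual code.
# When the "export all" button has not rendered yet, the filtering
# comprehension yields an empty list, and single-target unpacking of an
# empty list raises ValueError.
divs_pb3 = []  # pretend no matching elements were found on the page
matches = [b for b in divs_pb3 if b == "export all"]
try:
    export_all, = matches  # expects exactly one matching element
    err = None
except ValueError as exc:
    err = str(exc)
print(err)  # → not enough values to unpack (expected 1, got 0)
```

So the error message is Python's generic unpacking complaint; the underlying cause is that the export popup's contents were not there when the scraper looked for them.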
Erik Newhard commented
Does this happen every hour or only occasionally?
Keng-Chi Chang commented
> Does this happen every hour or only occasionally?
Every hour :(
Keng-Chi Chang commented
Okay, now it has started working (although it failed 6 out of 24 runs in the past 24 hours).
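Since the failure is intermittent (the export button sometimes just loads too slowly), one generic mitigation is to retry the flaky step a few times before giving up. A minimal asyncio sketch of that pattern — `scrape_once` is a hypothetical stand-in for the real download step, not roam-to-git's API:

```python
import asyncio

calls = {"n": 0}

async def scrape_once() -> str:
    # Hypothetical stand-in for the real download step; here it fails
    # twice before succeeding, mimicking a slow-loading export popup.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ValueError("not enough values to unpack (expected 1, got 0)")
    return "archive.zip"

async def scrape_with_retries(attempts: int = 5, delay: float = 0.01) -> str:
    # Retry the flaky step, pausing briefly between attempts; re-raise
    # the last error only once all attempts are exhausted.
    for attempt in range(1, attempts + 1):
        try:
            return await scrape_once()
        except ValueError:
            if attempt == attempts:
                raise
            await asyncio.sleep(delay)
    raise RuntimeError("unreachable")

result = asyncio.run(scrape_with_retries())
print(result)  # → archive.zip
```

A retry like this would not fix the underlying race, but it would likely cut the 6-in-24 failure rate for a step that succeeds most of the time.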