panda-re / lava

LAVA: Large-scale Automated Vulnerability Addition

Lava Dockerfile Fatal Error: host.json not found. Copy host.json.example to host.json

Kigorky opened this issue

Hi,

I have been trying to use LAVA, but I am unable to get it working.

After cloning the repo, I ran setup.py and got the following dependency errors:

[setup.py] Installing LAVA apt-get dependencies

[setup.py] Running [sudo apt-get -y install libjsoncpp-dev postgresql jq python-psycopg2 python-sqlalchemy socat libpq-dev cmake docker.io bc python-pexpect python-psutil python-lockfile genisoimage inotify-tools build-essential python-pip libprotobuf-c0-dev libodb-pgsql-2.4 libfdt-dev] . . .
Reading package lists... Done
Building dependency tree       
Reading state information... Done
Package python-psycopg2 is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source

Package python-pip is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source
However the following packages replace it:
  python3-pip

E: Package 'python-psycopg2' has no installation candidate
E: Package 'python-pip' has no installation candidate
E: Unable to locate package libprotobuf-c0-dev

[setup.py] [sudo apt-get -y install libjsoncpp-dev postgresql jq python-psycopg2 python-sqlalchemy socat libpq-dev cmake docker.io bc python-pexpect python-psutil python-lockfile genisoimage inotify-tools build-essential python-pip libprotobuf-c0-dev libodb-pgsql-2.4 libfdt-dev] cmd did not execute properly.

Furthermore, I tried to use the Dockerfile to build the Docker image, ran the configure script in the /libodb-pgsql-2.4.0 directory, and cloned the lava repo within the Docker container. When I run the following command:
$ ./scripts/lava.sh toy
I get:
[everything] Parsing args
[everything] All steps will be executed
Fatal error: host.json not found. Copy host.json.example to host.json

Can you help me, please?

Thank you.

I'm not sure about the first one; my guess is that you are trying this on a newer version of Ubuntu, where some of the older Python 2 packages have been removed.
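If that is the case, installing the Python 3 / renamed equivalents might get you past the apt-get failures (the package names below are my guess at the current Ubuntu equivalents; you would probably also need to adjust the dependency list in setup.py to match):

sudo apt-get -y install python3-pip python3-psycopg2 libprotobuf-c-dev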

As for the second issue, did you try doing what the error message says?

[everything] Parsing args
[everything] All steps will be executed
Fatal error: host.json not found. Copy host.json.example to host.json

In other words, try copying host.json.example to host.json.

The host.json.example file is in the lava directory, which is fine, but where is the host.json file? Searching for it with
$ find / -type f -name "host.json"
produces no output, so it seems it is not present in the Docker container.

Right - the error message is telling you that you need to copy host.json.example to host.json. As in:

cp host.json.example host.json

I had already done that; however, I got the following output:
[everything] Parsing args
[everything] All steps will be executed
Fatal error: /home/fasano/lava/target_configs/toy/toy.json not found. Did you provide the right project name?

I think the script is searching for the toy.json file in the wrong path. When the repo is cloned from GitHub, the correct path to the toy.json file is:
/lava/target_configs/toy

but the script is searching for it in /home/fasano/lava/target_configs/toy/toy.json
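A workaround might be to point the path the script expects at the actual clone (just a sketch; this assumes the repo really lives at /lava inside the container, and that /home/fasano/lava is the path reported in the error):

mkdir -p /home/fasano
ln -s /lava /home/fasano/lava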

I edited the scripts and now everything seems to be OK ;) Thank you!

Glad to hear it! I'll close this issue.