RedBeardLab / rediSQL

Redis module that provides a completely functional SQL database

Home Page: https://redisql.com


redis bgsave ran for 48 hours, but no data in memory was dumped into the .rdb file

mengpengfei opened this issue

I loaded rediSQL into redis-6.0.6.
The Redis BGSAVE thread has been running for 48 hours after a batch insert of 20,000,000 rows (INSERT INTO xxx VALUES (...),(...)), but no data in memory was dumped into the .rdb file.
If I restart redis-server, most of the data in memory is lost.

If I kill the BGSAVE thread, two files are created:
rediSQL_rdb_write_f6f4e329-f9ca-445b-9341-763209b1654e.sqlite
rediSQL_rdb_write_f6f4e329-f9ca-445b-9341-763209b1654e.sqlite-journal

Then another BGSAVE thread starts,
and that last BGSAVE thread saves the in-memory data to the .rdb file successfully.

rediSQL must have a critical issue with BGSAVE.

What version of RediSQL are you using?

We do have tests to account for this, so it should not happen.

So, you start a BGSAVE, it runs forever, and so you decided to kill it.
Then another thread starts, and the second one successfully saves the data.

Is that correct?

Feel free to reach me by email.

I am using this version:
https://github.com/RedBeardLab/rediSQL/files/3504726/redisql_1.1.2.tar.gz

Yes, that is correct. The first BGSAVE thread runs forever.

I need to transfer the data of a SQLite table into a rediSQL table.
The SQLite table has 20,000,000+ rows.

When I insert only a little data into the rediSQL table, no errors occur.
When I batch insert the data into a rediSQL table, the BGSAVE thread runs forever, but no data in memory is dumped into the .rdb file.

Hmmm, I tried to reproduce the problem using the latest version, and it was not there.

But I didn't touch that code. So more investigation is needed.

How did you create the database?

Also, how many rows are in the table, and how big are they?

Thank you for your answer.
These are the two commands I used to create the database and the table:
REDISQL.CREATE_DB TMC
REDISQL.EXEC TMC "CREATE TABLE tmctable (TileId INTEGER ,RoadId INTEGER,TmcId INTEGER ,IsGeoLine INTEGER ,Heading REAL ,StartShapeX REAL,StartShapeY REAL,EndShapeX REAL,EndShapeY REAL,BBLBX REAL,BBLBY REAL,BBRTX REAL,BBRTY REAL,Direction INTEGER,LinkType INTEGER,Flag INTEGER,UpdateTime INTEGER,seqId INTEGER,PRIMARY KEY (seqId) );"

There are 25,000,000 rows to insert into the table.

Can you use the following version to test?

https://github.com/RedBeardLab/rediSQL/files/3504726/redisql_1.1.2.tar.gz

I compiled the source of v1.1.2 and tested it.
The issue occurs when I have inserted 50,000,000+ rows into rediSQL.
I batch inserted into rediSQL with a batch size of 10,000, roughly as in the sketch below.
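
For illustration, this is roughly what my loader does (a simplified Python sketch, not my real code; the source file name tmc.sqlite and the connection details are made up):

import redis
import sqlite3

# Read rows out of the source SQLite table and push them into rediSQL
# in batches of 10000 rows per REDISQL.EXEC call.
r = redis.Redis(host="localhost", port=6379)
src = sqlite3.connect("tmc.sqlite")
cur = src.execute("SELECT * FROM tmctable")

BATCH_SIZE = 10000
while True:
    rows = cur.fetchmany(BATCH_SIZE)
    if not rows:
        break
    # All 18 columns of tmctable are INTEGER or REAL, so plain str() is
    # enough to render each value into the VALUES list.
    values = ",".join("(" + ",".join(str(v) for v in row) + ")" for row in rows)
    r.execute_command("REDISQL.EXEC", "TMC", "INSERT INTO tmctable VALUES " + values + ";")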

Does the issue appear with the latest version?

The commands are all the same, but you need to add a V1 between REDISQL and $COMMAND_NAME, so it will be like: REDISQL.V1.EXEC
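
For example, following that naming rule, your commands would become (the VALUES list is elided here; it is the same insert as before):

REDISQL.V1.CREATE_DB TMC
REDISQL.V1.EXEC TMC "INSERT INTO tmctable VALUES (...);"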

Yes, it appeared.
Can the command REDISQL.V1.EXEC avoid the issue?

The latest version is version 2, not 1.1.2, so the one from master.

So you are saying that this also happens if you use the master version, correct?

Unfortunately, I don't believe that the way we insert elements has anything to do with the way the RDB file is created. I need to take a hard look at the issue.

If you want to help I can guide you on the investigation.

Sorry, I have not used the master version; I tested on version 1.1.2.

I am interested in the project, but I cannot code in C; I only code in Java or Python.

Ah, I see!

I am a little tight on time at the moment! I need to prioritize what I can do with my time.

This is the curse of open source: I cannot invest in it unless I have some income source from it...

I will try to do my best...

Sorry, and thanks. Can I use rediSQL on a Redis cluster?

Hi!

Yes, you can, but it will partition by database.

So if you have multiple databases (the ones you create with REDISQL.CREATE_DB DB1), those are partitioned.

If you push all your data into a single database, you won't get any benefit from the cluster.
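
For example (a sketch with made-up database names): each database lives under a single Redis key, so DB1 and DB2 below can end up on different cluster nodes, while everything inside one database always stays on one node.

REDISQL.CREATE_DB DB1
REDISQL.CREATE_DB DB2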

Thank you very much. I finally solved the problem by adding some memory to the physical server.
Perhaps the issue occurs when the server does not have enough memory.
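
In case it helps others, a small sketch (host and port are assumptions) of how to watch Redis memory while the BGSAVE thread runs:

import redis

# Poll Redis memory usage; if used_memory approaches the host's RAM,
# the save is likely being starved for memory.
r = redis.Redis(host="localhost", port=6379)
mem = r.info("memory")
print(mem["used_memory_human"], mem["maxmemory_human"])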

That is very good to hear!

It should stream to disk, but of course we need memory to keep the bookkeeping of everything.