Storing a 7-million-key Python dictionary in a database


I have to handle a 7-million-key dictionary (the number of keys can eventually reach ~50 million). Since I barely have enough RAM to keep it in memory, I've decided to store it on disk.

My dictionary looks like this:

dictionary={(int1,int2):int3,...} 

First I tried to store it in an SQLite database using sqlite3. The amount of time required to store it is OK (around 70 secs), using timeit:

>>> import sqlite3
>>> conn=sqlite3.connect('test_sqlite.sqlite')
>>> c=conn.cursor()
>>> c.execute('create table test (int1 int, int2 int, int3 int)')
>>> conn.commit()
>>> conn.close()
>>> import timeit
>>> timeit.timeit('c.executemany("insert into test values (?,?,?)",((key[0],key[1],dictionary[key]) for key in dictionary.iterkeys()))',setup='import sqlite3;conn=sqlite3.connect("test_sqlite.sqlite");c=conn.cursor();dictionary={(i,i+1):i+2 for i in xrange(7000000)}',number=1)
70.7033872604
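The transcript above is Python 2 (`iterkeys`, `xrange`). As a minimal Python 3 sketch of the same bulk insert, with a much smaller dictionary for illustration, the generator expression streams rows into `executemany` without building an intermediate list, and the `with conn:` block wraps the whole insert in one transaction:

```python
import sqlite3

# In-memory database for illustration; the question uses a file on disk.
conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("create table test (int1 int, int2 int, int3 int)")

# Same shape as the question's dictionary, but only 10,000 keys.
dictionary = {(i, i + 1): i + 2 for i in range(10000)}

# executemany accepts any iterable of parameter tuples, so the
# dictionary can be streamed row by row.
with conn:  # one transaction for the entire bulk insert
    c.executemany(
        "insert into test values (?,?,?)",
        ((k[0], k[1], v) for k, v in dictionary.items()),
    )

count = c.execute("select count(*) from test").fetchone()[0]
print(count)
```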

But then I need to use the stored dictionary to retrieve values, and each SELECT seems to take approximately 1.5 secs. Since I need to access around 1 million values, this is discouraging:

>>> timeit.timeit('c.execute("select int1 from test where int2=={}".format(value)).fetchone()[0]',setup='import sqlite3;conn=sqlite3.connect("test_sqlite.sqlite");c=conn.cursor();value=5555',number=1)
1.5300869941711426
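Without an index on the queried column, that SELECT has to scan all 7 million rows, which is why each lookup takes ~1.5 s. As a side note, a parameterized query (`?` placeholder) is preferable to building the SQL with `format()`. A small Python 3 sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("create table test (int1 int, int2 int, int3 int)")
c.executemany(
    "insert into test values (?,?,?)",
    ((i, i + 1, i + 2) for i in range(1000)),
)

# Parameterized lookup; with no index on int2 this is still a full
# table scan, just like the slow query in the question.
value = 555
row = c.execute("select int1 from test where int2=?", (value,)).fetchone()
print(row[0])
```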

Then I tried to store the dictionary in a shelf. The amount of time needed to fetch a value from the shelved dictionary is good:

>>> timeit.timeit('a=f[key]',setup='import shelve;f=shelve.open("test_timeit","r");key="1000"',number=10000)
0.320019006729126

So even though I'll make several million requests instead of one, the total amount of time should be around a hundred secs.

But a new problem arose: the time required to store the dictionary in a shelf doesn't satisfy me.

>>> timeit.timeit('f.update(dictio)',setup='import shelve;f=shelve.open("test_timeit","c");dictio={"({},{})".format(i,i+1):i+2 for i in xrange(7000000)}',number=1)
504.728841782
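For context, a shelf is a persistent dict-like mapping whose keys must be strings, which is why the tuple keys get converted with `"({},{})".format(...)` above. A minimal Python 3 sketch of the write-then-read round trip, using a temporary directory instead of the question's `test_timeit` file:

```python
import os
import shelve
import tempfile

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "test_shelf")

    # Store a small dictionary with stringified tuple keys,
    # the same key scheme as in the question.
    with shelve.open(path, "c") as f:
        f.update({"({},{})".format(i, i + 1): i + 2 for i in range(100)})

    # Reopen read-only and fetch one value by its string key.
    with shelve.open(path, "r") as f:
        val = f["(5,6)"]

print(val)
```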

To this one must add the time required to convert the former keys (which are tuples) into strings. Using repr:

>>> timeit.timeit('repr.repr((1,2))',setup='import repr',number=7000000)
61.6035461426
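(The `repr` module is Python 2 only; in Python 3 the built-in `repr()` serves the same purpose.) As a hypothetical alternative, not from the original post, a `%`-format string also produces a usable string key and avoids the parentheses and spaces that `repr` adds:

```python
# Two ways to turn the tuple key (1, 2) into a string key for a shelf.
pair = (1, 2)

via_repr = repr(pair)       # what the question does (built-in repr)
via_fmt = "%d,%d" % pair    # alternative: compact comma-separated form

print(via_repr)
print(via_fmt)
```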

Which makes a total of 566.332387924 secs to store the dictionary into a shelf...

I don't want to pickle the dictionary, since that implies I'll have to load the whole dictionary if I want to use it later.

Is there a way I can improve one of these two methods in order to get better access/loading times?

Thanks!

For queries on large tables to return quickly, you need to index the relevant columns. In your case, add a primary key:

create table test (
    int1 integer,
    int2 integer,
    int3 integer,
    primary key (int1, int2)
)
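With the composite primary key, a lookup by `(int1, int2)` becomes an index search instead of a full table scan, which `EXPLAIN QUERY PLAN` confirms. A small Python 3 sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute(
    "create table test ("
    " int1 integer, int2 integer, int3 integer,"
    " primary key (int1, int2))"
)
c.executemany(
    "insert into test values (?,?,?)",
    ((i, i + 1, i + 2) for i in range(1000)),
)

# Indexed lookup by the full primary key.
row = c.execute(
    "select int3 from test where int1=? and int2=?", (10, 11)
).fetchone()
print(row[0])

# The query plan reports SEARCH (index lookup) rather than SCAN.
plan = c.execute(
    "explain query plan "
    "select int3 from test where int1=? and int2=?", (10, 11)
).fetchall()
print(plan)
```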
