dict is really slow for big data
I am trying to load a big file into a dict. The file has about 9,000,000 lines,
something like:
1 2 3 4
2 2 3 4
3 4 5 6
code:

    d = {}                               # use a named dict rather than shadowing the built-in `dict`
    for line in open(path):              # `path` is the input file name
        arr = line.strip().split('\t')   # split the tab-separated fields
        d[arr[0]] = arr                  # key each row by its first field
But the dict gets really slow as I load more data into memory. By the way,
the Mac I am using has 16 GB of memory.
Is this caused by poor dict performance when it has to grow, or by some
other reason?
Can anyone provide a better solution?
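One thing worth trying (a minimal sketch, not a definitive fix, and the file name
`data.txt` plus the assumption that every field is an integer are mine, not from
the post): per-row lists of strings carry a lot of object overhead, so storing a
compact tuple of ints per key, and pausing the cyclic garbage collector during the
bulk load, can noticeably reduce both memory use and load time.

    import gc

    def load(path='data.txt'):
        d = {}
        gc.disable()                     # bulk loads are often faster with the cyclic GC paused
        try:
            with open(path) as f:
                for line in f:
                    fields = line.split()                         # tab/whitespace-separated fields
                    d[int(fields[0])] = tuple(int(x) for x in fields[1:])  # int key, tuple payload
        finally:
            gc.enable()                  # always re-enable the collector afterwards
        return d

If the fields are not all integers, keep them as strings but still prefer a tuple
over a list for the stored value.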
dict becomes very slow with big data
Hi, I am trying to insert a lot of data into a dict, on the order of 10,000,000
entries. After part of the data has been inserted, the insert rate becomes very
slow, about 50,000/s, and the total time for the task is very long as well.
Would anyone know a solution for this case? Thanks.
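A small benchmark sketch (an assumed setup with synthetic keys, not the original
poster's code) that reports the overall insert rate once per batch; it makes it
easy to see whether the rate really degrades as the dict grows, and whether
pausing the garbage collector changes the picture.

    import gc
    import time

    N = 10_000_000          # total insertions, matching the size mentioned above
    BATCH = 1_000_000       # report the rate once per batch

    gc.disable()            # optional: run with and without this line to compare
    d = {}
    start = time.perf_counter()
    for i in range(N):
        d[i] = (i, i + 1, i + 2)         # synthetic payload standing in for the real rows
        if (i + 1) % BATCH == 0:
            elapsed = time.perf_counter() - start
            print(f"{i + 1:>12,d} items, {(i + 1) / elapsed:,.0f} inserts/s overall")
    gc.enable()

If the rate only falls off sharply after several million items, memory pressure or
garbage-collector pauses are likelier culprits than the dict itself.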
