Thank you, I will look into this tomorrow and I'll post back if I run into trouble.
Cheers!
-
Also, I'm not very familiar with MySQL. Is there no alternative "large list" implementation, say one that stores the data on disk and loads one chunk ("page") into RAM at a time?
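One way to get that disk-backed, page-at-a-time behaviour without a MySQL server is the stdlib's `sqlite3`. A minimal sketch (table layout, page size, and all names are illustrative, not from the thread):

```python
import sqlite3

# Disk-backed "large list": values live in an on-disk SQLite database
# (stdlib, no server needed) and are fetched one chunk ("page") at a time.
conn = sqlite3.connect(":memory:")  # use a filename for real on-disk storage
conn.execute("CREATE TABLE data (idx INTEGER PRIMARY KEY, value REAL)")
conn.executemany("INSERT INTO data VALUES (?, ?)",
                 ((i, i * 0.5) for i in range(100)))
conn.commit()

def pages(conn, page_size):
    """Yield successive chunks of values; only one chunk is in RAM at a time."""
    offset = 0
    while True:
        rows = conn.execute(
            "SELECT value FROM data ORDER BY idx LIMIT ? OFFSET ?",
            (page_size, offset)).fetchall()
        if not rows:
            break
        yield [v for (v,) in rows]
        offset += page_size

# Aggregate over the whole "list" while holding at most 32 values:
total = sum(sum(chunk) for chunk in pages(conn, 32))
```

Statistics over the whole dataset can then be accumulated chunk by chunk, as above.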
-
Hmm, sorry I wasn't very clear. What I meant is:
(i) the files contain roughly month-long measurements, and once I've read a file in I'd like to have e.g. per-day or per-week means, or to focus on just the first hours of a specific file. That's what I meant by not wanting to read the whole thing over and over again...
(ii) as for the last line, I guess it's not a big issue; I just need to know the duration of all the measurements...
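Per-day and per-week means don't require keeping the raw samples at all: one pass over the file can bucket each sample by day and keep only running sums and counts. A sketch, with `samples` standing in for parsed `(time_in_hours, value)` lines (the names and the hourly layout are assumptions for the demo):

```python
from collections import defaultdict

# Two weeks of hourly "measurements": (time in hours, value).
samples = [(h, float(h % 24)) for h in range(24 * 14)]

# One pass: accumulate per-day sums and counts only.
day_sum = defaultdict(float)
day_cnt = defaultdict(int)
for t, v in samples:
    day = int(t // 24)
    day_sum[day] += v
    day_cnt[day] += 1

day_mean = {d: day_sum[d] / day_cnt[d] for d in day_sum}

# Weekly means come from the daily sums -- no second pass over the data.
week_mean = {w: sum(day_sum[d] for d in day_sum if d // 7 == w) /
                sum(day_cnt[d] for d in day_cnt if d // 7 == w)
             for w in {d // 7 for d in day_sum}}
```

The same trick works for any block size: accumulate `(sum, count)` per block in one pass, then derive the means.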
-
Yes, I thought of that. The problem is: (i) I need to be able to calculate statistics for different block sizes without having to read the file over and over again, and (ii) I need some info from the very last line (the files have a time column and started at the same time, but are not equally long).
Is there any way to store the whole thing in some kind of data structure (e.g. by creating a class "extending" list or something)? Sorry...
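For point (ii), the last line of a big file can be read without loading the rest: seek near the end and read only the tail. A sketch (assumes the last line fits in the `probe` window; the file layout `time,val1,val2` is just for the demo):

```python
import os

def last_line(path, probe=4096):
    """Return the final line of a file by reading only its last `probe` bytes."""
    with open(path, "rb") as f:
        f.seek(0, os.SEEK_END)
        size = f.tell()
        f.seek(max(0, size - probe))
        tail = f.read()
    return tail.splitlines()[-1].decode()

# Tiny demo file with a hypothetical time,val1,val2 layout:
with open("demo.txt", "w") as f:
    for t in range(1000):
        f.write(f"{t},{t * 2},{t * 3}\n")

end = last_line("demo.txt")  # -> "999,1998,2997"
```

Reading the last time stamp this way gives each file's duration in constant time, regardless of file size.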
-
Large lists in python
Hello,
I need to store data in large lists (~1e7 elements) and I often get a memory error in code that looks like:
Code:
f = open('data.txt', 'r')
for line in f:
    tokens = line.split(',')
    list1.append(tokens[1])
    list2.append(tokens[2])
    # etc.
I get the error when reading in the data, but I don't really need all elements to be stored in RAM all the time. I work with chunks of...
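Since only chunks are needed at a time, a generator can hand back one block of parsed rows and let each block be garbage-collected before the next is built. A sketch, reusing the comma delimiter and column indices from the snippet above (`chunk_size` and the demo input are illustrative):

```python
import io

def read_chunks(f, chunk_size=10000):
    """Yield lists of (col1, col2) tuples; at most chunk_size rows are in RAM."""
    chunk = []
    for line in f:
        tokens = line.rstrip("\n").split(",")
        chunk.append((tokens[1], tokens[2]))
        if len(chunk) == chunk_size:
            yield chunk
            chunk = []
    if chunk:            # final, possibly short, chunk
        yield chunk

# Demo with an in-memory "file" of 25 lines:
fake = io.StringIO("\n".join(f"a{i},b{i},c{i}" for i in range(25)))
sizes = [len(c) for c in read_chunks(fake, chunk_size=10)]
```

Each chunk can be processed (e.g. its statistics accumulated) and then dropped, so peak memory is bounded by `chunk_size`, not the file size.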
-
OK, here's the code.
The first part is functions, which you probably don't need to look at at all. The for loop in question starts at line 180. Until line 215 I just read the first line of each file to see which columns I will need to use, and then I store the columns in lists.
In the final part (line 240 to the end) I do some calculations on subsets of the columns and write the results to a file.
Thanks a lot!...
-
This must be quite straightforward. I guess I am not seeing something.
I had already tried gc.enable(). Now I tried adding
print(gc.collect())
at the end of each for loop, and it printed:
90
0
but I still get a memory error on the next loop.
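Those return values are consistent with `gc.collect()` not being the tool for this: it reclaims reference *cycles* only (its return value counts unreachable objects found), while ordinary lists are freed by reference counting the instant the last reference to them disappears. A small sketch of the distinction (variable names are illustrative):

```python
import gc
import sys

# A big plain list: no cycles involved.
big = list(range(1_000_000))

refs = sys.getrefcount(big)   # at least 2: the name 'big' + the call argument
collected = gc.collect()      # collects cycles only; does not touch 'big'

# 'big' is still alive here because a live name refers to it.
still_alive = big is not None

del big   # reference count drops to zero -> memory released immediately,
          # no gc.collect() needed
```

So a non-zero `gc.collect()` result followed by a memory error usually means the big lists are still *referenced* somewhere, not that collection failed.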
-
Thank you, but still no light... All files are closed, yes, and I don't think there are any cyclic references. Is there a way to check what is actually still stored in memory when the first loop finishes?
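One crude way to answer that question: ask the garbage collector for every object it tracks and report the largest lists still alive. A sketch (the helper name is made up; this inspects live objects, it does not change them):

```python
import gc

def biggest_lists(top=5):
    """Return (length, type name) for the `top` largest lists still referenced."""
    lists = [o for o in gc.get_objects() if isinstance(o, list)]
    lists.sort(key=len, reverse=True)
    return [(len(o), type(o).__name__) for o in lists[:top]]

# Demo: a deliberately leaked list shows up at the top of the report.
leftover = list(range(50000))
report = biggest_lists()
```

Running this at the end of the first loop iteration should reveal whether the per-file lists from the previous file are still being referenced somewhere.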
-
I tried it but it still doesn't work. Very annoying that the error always appears after the first file, and I have about 400 files; I cannot do them one by one...
Maybe I need to do something in the for loop to free the used memory after reading the first file?
-
how to release memory?
I am new to Python. I wrote a program that reads in large text files using xreadlines, stores some values in lists (and arrays), and does some calculations. It runs fine for individual files, but when I try to process the files from a folder consecutively, I get a memory error.
My program looks like this:
...
Code:
data = fileName.xreadlines()
for line in data:
    tokens = line.split(';')
    list1.append(tokens[2])
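As a side note, `xreadlines()` is Python-2-only (removed in Python 3); iterating the open file object directly streams lines just as lazily, and a `with` block guarantees each file is closed before the next one is opened. A sketch of the same loop in that style (the file name and helper are illustrative):

```python
def column(path, index, sep=";"):
    """Collect one column from a `sep`-delimited text file, streaming line by line."""
    values = []
    with open(path) as f:          # closed automatically, even on error
        for line in f:             # lazy iteration: one line in RAM at a time
            values.append(line.rstrip("\n").split(sep)[index])
    return values

# Tiny demo file matching the snippet's ';'-separated layout:
with open("sample.txt", "w") as f:
    f.write("0;a;x\n1;b;y\n2;c;z\n")

col = column("sample.txt", 2)  # -> ['x', 'y', 'z']
```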