User Profile

fekioh
Last Activity: Aug 16 '10, 09:33 PM
Joined: Aug 6 '10

  • fekioh
    replied to Large lists in python
    Thank you, I will look into this tomorrow and I'll post back if I run into trouble..

    Cheers!


  • fekioh
    replied to Large lists in python
    Also, I'm not very familiar with MySQL. Is there an alternative "large list" implementation that, say, stores the data on disk and loads it into RAM a chunk ("page") at a time?
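    One way to do the "page at a time" loading the reply asks about is a generator that yields fixed-size chunks, so only one chunk is resident in RAM at a time. A minimal sketch (the function name, chunk size, and comma separator are illustrative, not from the thread):

    ```python
    def read_chunks(path, chunk_size=100000):
        """Yield the file's rows in lists of at most chunk_size parsed
        lines, so only one chunk lives in RAM at a time."""
        chunk = []
        with open(path) as f:
            for line in f:
                chunk.append(line.rstrip('\n').split(','))
                if len(chunk) == chunk_size:
                    yield chunk
                    chunk = []
        if chunk:
            yield chunk
    ```

    Each yielded chunk can be reduced to its statistics and then discarded before the next one is read.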


  • fekioh
    replied to Large lists in python
    Hmm, sorry I wasn't very clear. What I meant is:

    (i) the files contain roughly month-long measurements, and once I've read a file in I'd like to be able to compute e.g. per-day or per-week means, or to focus on just the first hours of a specific file. That's what I meant by not wanting to read the whole thing over and over again...

    (ii) as for the last line, I guess it's not a big issue. I just need to know the duration of all the measurements...
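    The per-day or per-week means described here can be computed from a column that is already in memory without re-reading the file: a "day" and a "week" are just different block sizes over the same data. A sketch (function name and block sizes are illustrative):

    ```python
    def block_means(values, block_size):
        """Mean of each consecutive block of block_size values;
        per-day vs. per-week is just a different block_size."""
        means = []
        for i in range(0, len(values), block_size):
            block = values[i:i + block_size]
            means.append(sum(block) / float(len(block)))
        return means
    ```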


  • fekioh
    replied to Large lists in python
    Yes, I thought of that. The problem is: (i) I need to be able to calculate statistics for different block sizes without reading the file over and over again, and (ii) I need some info from the very last line (the files have a time column and start at the same time, but are not equally long).

    Is there any way to store the whole thing in some kind of data structure (e.g. by creating a class "extending" list or something)? Sorry...
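    For point (ii), the last line of a large file can be read without loading the rest, by seeking backwards from the end. A sketch assuming an ordinary text file (the helper name is made up):

    ```python
    import os

    def last_line(path):
        """Return the final line of a text file by seeking from the end."""
        with open(path, 'rb') as f:
            f.seek(0, os.SEEK_END)
            pos = f.tell() - 1
            if pos < 0:
                return ''              # empty file
            while pos > 0:             # skip trailing newline(s)
                f.seek(pos)
                if f.read(1) != b'\n':
                    break
                pos -= 1
            start = pos
            while start > 0:           # scan back to the previous newline
                f.seek(start - 1)
                if f.read(1) == b'\n':
                    break
                start -= 1
            f.seek(start)
            return f.readline().decode().rstrip('\n')
    ```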


  • fekioh
    started a topic Large lists in python

    Large lists in python

    Hello,

    I need to store data in large lists (~1e7 elements) and I often get a memory error in code that looks like:

    Code:
    f = open('data.txt', 'r')
    for line in f:
        fields = line.split(',')   # split each line once
        list1.append(fields[1])
        list2.append(fields[2])
        # etc.

    I get the error when reading in the data, but I don't really need all elements to be stored in RAM all the time. I work with chunks of...
    Last edited by bvdet; Aug 14 '10, 03:22 PM. Reason: Add code tags
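    If the columns are numeric, one way to shrink a ~1e7-element column is to parse it into a typed array instead of a list of strings: 1e7 C doubles take about 80 MB, far less than 1e7 separate Python string objects. A sketch (the file name and column indices follow the post; the function is illustrative):

    ```python
    from array import array

    def read_columns(path):
        """Parse two numeric CSV columns into compact C-double arrays."""
        col1 = array('d')
        col2 = array('d')
        with open(path) as f:
            for line in f:
                fields = line.split(',')   # split once per line
                col1.append(float(fields[1]))
                col2.append(float(fields[2]))
        return col1, col2
    ```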

  • fekioh
    replied to how to release memory?
    Thanks a lot!


  • fekioh
    replied to how to release memory?
    OK, here's the code.

    The first part is functions which you probably don't need to look at at all. The for loop in question starts at line 180. Until line 215 I just read the first line of each file to see which columns I will need to use, and then I store the columns in lists.

    In the final part (lines 240 to the end) I do some calculations on subsets of the columns and write the results to a file..

    Thanks a lot!...


  • fekioh
    replied to how to release memory?
    This must be quite straightforward. I guess I am not seeing something.

    I had already tried gc.enable()... Now I tried

    print(gc.collect())
    print(gc.collect())

    at the end of each for loop, and it printed:
    90
    0

    but I still get a memory error on the next loop.
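    Worth noting here: gc.collect() can only reclaim objects that are no longer referenced, so the per-file lists have to be dropped before collecting does any good. A sketch of that pattern (the per-file work is a stand-in, not the thread's actual code):

    ```python
    import gc

    def process_files(paths):
        """Handle files one at a time, releasing each file's data
        before moving on to the next."""
        results = []
        for path in paths:
            with open(path) as f:
                rows = [line.rstrip('\n').split(';') for line in f]
            results.append(len(rows))   # stand-in for the real statistics
            del rows                    # drop the only reference to the list
            gc.collect()                # now collection can actually free it
        return results
    ```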


  • fekioh
    replied to how to release memory?
    Thank you but still no light... All files are closed yes. And I don't think there are any cyclic refs. Is there a way to check what is still actually stored in memory when the first loop finishes?
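    On the question of checking what is still actually in memory when the first loop finishes: the collector can be asked directly for every object it tracks. A hypothetical helper built on gc.get_objects():

    ```python
    import gc

    def biggest_lists(n=5):
        """Report (length, element type name) for the n largest
        lists still alive and tracked by the garbage collector."""
        live = [o for o in gc.get_objects() if isinstance(o, list)]
        live.sort(key=len, reverse=True)
        return [(len(o), type(o[0]).__name__ if o else None)
                for o in live[:n]]
    ```

    Calling this between loop iterations shows whether a big list from the previous file is still being kept alive by some reference.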


  • fekioh
    replied to how to release memory?
    I tried it but it still does not work. It's very annoying that there is always an error after the first file, and I have around 400 files; I cannot do them one by one...

    Maybe I must do something in the for loop to free the used memory after reading the first file?


  • fekioh
    started a topic how to release memory?

    how to release memory?

    I am new to Python. I wrote a program that reads in large text files using xreadlines, stores some values in lists (and arrays) and does some calculations. It runs fine for individual files, but when I try to process files from a folder consecutively, I get a memory error.

    my program looks like this:
    Code:
    data = fileName.xreadlines()
    for line in data:
        tokens = line.split(';')
        list1.append(tokens[2])
    ...
    Last edited by bvdet; Aug 7 '10, 01:42 PM. Reason: Add code tags
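    A sketch of a folder loop where each iteration rebinds its lists fresh, so the previous file's data loses its last reference and becomes collectable before the next file is read (the folder layout, semicolon separator, and column index follow the post; the names and the yielded result are illustrative):

    ```python
    import os

    def process_folder(folder):
        """Process every file in a folder; each iteration rebinds
        list1, so the previous file's list can be freed."""
        for name in sorted(os.listdir(folder)):
            list1 = []                     # fresh list per file
            with open(os.path.join(folder, name)) as f:
                for line in f:
                    tokens = line.split(';')
                    list1.append(tokens[2])
            yield name, len(list1)         # stand-in for the real calculation
    ```

    Because the heavy lists live only inside one iteration, 400 files need no more memory than the single largest one.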