c# - Too "big" arrays. Do I need a database? Or are there other options? -


I encountered an OutOfMemory error in my program, due (I guess) to excessive use of arrays, specifically double multidimensional arrays (one is around 5000x60 elements). I'm searching for an alternative way of storing this information.

Are databases an alternative? Or is there some other way to store the information from the arrays so I can free memory? I read about the possibility of manually allocating memory (or splitting it into chunks? I don't recall what it was), but I guess that's not an option, because the arrays and objects grow more in size, so sooner or later I'd reach (in theory) the limits of physical RAM. Although I might be wrong here.

I'm not asking for huge code examples; links to tutorials or other reads (books are OK too, although I should be able to order them in Germany without billions of dollars in shipping costs ;-)) would be perfect.

Also: no, I've never worked with databases, but I'm willing to learn anything.

Thanks in advance for your help.

Edit: fixed "double array" to "multidimensional array".

Edit 2: I thought there might be a "universal" solution to the problem of huge arrays, but you proved me wrong. Since it's hard to find an alternative if people don't know what I'm doing, here's a quick overview of what I'm doing (it's for statistical purposes):

  • Read data from a CSV file; each line becomes a custom object MyClass1 containing around 60 objects of MyClass2.
  • Iterate through the instances of MyClass1, adding to or editing the properties of each of the MyClass2 objects, depending on various conditions.
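The two steps above could be sketched roughly like this. This is only a sketch: MyClass1/MyClass2 are the poster's names, but their fields, the CSV layout, and the editing condition are all assumptions (here a tiny generated file stands in for the real data):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Hypothetical shapes for the poster's classes; the real fields are unknown.
class MyClass2 { public double Value; }
class MyClass1 { public MyClass2[] Items = new MyClass2[60]; }

class Program
{
    static void Main()
    {
        // Stand-in for the real CSV file: two lines of 60 numbers each.
        string path = Path.GetTempFileName();
        File.WriteAllLines(path, new[]
        {
            string.Join(",", new double[60]),
            string.Join(",", new double[60])
        });

        var rows = new List<MyClass1>();

        // Step 1: one MyClass1 per CSV line, ~60 MyClass2 objects each.
        foreach (var line in File.ReadLines(path))
        {
            var fields = line.Split(',');
            var row = new MyClass1();
            for (int i = 0; i < row.Items.Length && i < fields.Length; i++)
                row.Items[i] = new MyClass2 { Value = double.Parse(fields[i]) };
            rows.Add(row);
        }

        // Step 2: iterate and edit properties depending on some condition
        // (the condition here is invented for illustration).
        foreach (var row in rows)
            foreach (var item in row.Items)
                if (item != null && item.Value < 1)
                    item.Value = 1;

        Console.WriteLine(rows.Count);             // 2
        Console.WriteLine(rows[0].Items[0].Value); // 1
    }
}
```

Note that `File.ReadLines` streams the file line by line rather than loading it all at once, which only helps if each line can be processed independently.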

If more information is needed, I'll surely post it. The reason I don't post code is that I wouldn't know which of the roughly 4000 lines to pick out. I'm not trying to withhold information because I'm coding something super secret; I just don't know what else might be useful.

Edit 3: to the question "Could you process each line one at a time, or do you need the context of the whole file first?":

  • I need the context of the whole file. For example, myclass1[4000][0] depends on data from myclass1[0][0], myclass1[1][0], etc. Some more information: reading the CSV file and creating the "empty" objects works fine. The error gets thrown later on, when I iterate through the myclass1 objects.

Edit: okay, it's clearer now what you're doing... A multidimensional array of size 5000 x 60 is still only 300,000 elements. What is each element? Do you think you actually need more data in memory at a time than your machine can handle?
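To put that 300,000-element figure in perspective, here is a quick back-of-the-envelope check using the 5000 x 60 size from the question:

```csharp
using System;

class Program
{
    static void Main()
    {
        // 5000 x 60 doubles at 8 bytes each: tiny by modern standards.
        long elements = 5000L * 60L;             // 300,000 elements
        long bytes = elements * sizeof(double);  // 2,400,000 bytes
        Console.WriteLine(elements);                          // 300000
        Console.WriteLine(bytes / (1024.0 * 1024.0) + " MB"); // ~2.29 MB
    }
}
```

So a raw `double[5000, 60]` is only a couple of megabytes; if an OutOfMemory error occurs, the likely culprit is the surrounding object graph (e.g. many per-element objects, or many retained copies of the data), not the matrix itself.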

It's not clear what you're doing - do you need all of the information in memory at once? Perhaps the problem is that you're preventing the arrays from being garbage collected when you're done with them?
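As a minimal sketch of that last point (the `Processor` class, `data` field, and `Finish` method are hypothetical names, not from the question): a long-lived object that keeps a field pointing at a large array prevents the GC from ever reclaiming it.

```csharp
class Processor
{
    // While this field holds a reference, the whole 5000 x 60 array stays
    // alive, even if no code will ever read it again.
    private double[,] data = new double[5000, 60];

    public void Finish()
    {
        // Dropping the reference makes the array eligible for collection
        // the next time the GC runs.
        data = null;
    }
}
```

The same applies to arrays held in static fields, long-lived lists, or event handlers: the GC only frees what nothing reachable still references.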

Without knowing what sort of processing you're doing, it's hard to say whether a database is appropriate, but arrays are about as efficient as you're going to get if you need a value for every element. If you can manage with a sparse array, that's a different matter, of course.


