Writing RAM data to the hard disk efficiently in a suitable format

  Asked By: Ryan    Date: Aug 11    Category: MS Office    Views: 719
  

I am working on a combinatorial application where the problem expands
exponentially and the data need to be saved. I have run out of RAM
storing an array a(2 million, 100) declared As Byte. Is there a way
to quickly write 1.8 million rows of this array from RAM to a file
(an Excel sheet, a text file, etc.) so that I can reuse those 1.8
million rows of the array? Do you have an idea how much time it would
take: 10 seconds, 1 minute, or 10 minutes?

Later on I may need to retrieve information from the file for some of
these rows. Which file type would be appropriate for writing so that
the data can be retrieved efficiently?
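For reference, this is roughly what the declaration looks like (a sketch only, with placeholder names):

' 2,000,000 rows * 100 columns * 1 byte each is about 200,000,000 bytes,
' i.e. roughly 190 MB for this one array.
Dim a() As Byte
ReDim a(1 To 2000000, 1 To 100)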

4 Answers Found

 
Answer #1    Answered By: Bonita Garcia     Answered On: Aug 11

Sounds as though you would want to use a random access file.

No real feel for how long it would take. But if it could do 1,000 records per
second (which it probably couldn't), 1.8 million rows would still take 1,800
seconds, which is 30 minutes. It would probably be significantly slower than that.

The concept of trying to handle 2-million-row arrays inside Excel is just wrong.
You need to rethink your algorithm to eliminate or minimise the array, or move
the whole thing out of Excel into something that has a chance of doing this more
efficiently.

Your last comment tends to indicate that you are going about this the wrong way:
"Later on I MAY need to retrieve information from the file for SOME of these
rows" (my emphasis). Why would you produce information that you don't need? Is
this a static data file that can be created once and then used many times?
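If you do try the random access route, a minimal sketch would look something like the following, assuming each 100-byte row of your array becomes one fixed-length record (the file name and all the names below are made up). The Type block has to sit at module level, above the procedures.

Type NodeRecord
    Data(1 To 100) As Byte        ' one 100-byte row of the array
End Type

Sub RandomAccessSketch()
    Dim rec As NodeRecord
    Dim f As Integer
    Dim i As Long

    f = FreeFile
    Open "C:\temp\nodes.dat" For Random As #f Len = Len(rec)

    ' Write record number i; records can be written and read in any order.
    i = 123456
    rec.Data(1) = 42
    Put #f, i, rec

    ' Later, fetch the same record back directly by its record number.
    Get #f, i, rec

    Close #f
End Sub

Even so, as I say, I would expect 1.8 million individual Put calls to be slow.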

 
Answer #2    Answered By: Elmer Young     Answered On: Aug 11

These combinatorial algorithms generally create around 500 million
nodes. For each node we only know the minimum and maximum possible
solution it can give; to know the actual value we have to solve it,
which costs computational time. The actual solution could be any value
between the minimum and the maximum for that particular node. We are
trying to find the best (lowest) solution.

If the best solution obtained so far is 550 units, I automatically
reject any further node with minimum value >= 550. The algorithm
therefore automatically rejects around 450 million nodes, but at this
stage I still need to save the nodes with minimum values of 548 or 549.

As the algorithm proceeds, I may find a better solution, say 548; then I
can reject all nodes with minimum value 548 or 549 and free up some
memory. The number of nodes keeps increasing and decreasing as the
algorithm runs, creating around 10 million nodes (with minimum solution
< best solution) at intermediate stages. As RAM can only manage 2
million nodes, I would have to write the remaining nodes (those with
larger minimum solution values than the 2 million kept in RAM) somewhere
on the hard disk, in binary, text, or whichever format takes less time.
In 80% of cases I never need the 8 million nodes written to disk, but in
20% of cases I do need some of them (1 to 4 million) back after solving
the nodes held in RAM (by which point RAM has been cleared of the 2
million previous nodes).
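In rough VBA terms, the decision I have to make for every node is something like this (just a sketch; every name here is hypothetical):

Const MAX_IN_RAM As Long = 2000000

Sub HandleNode(nodeMin As Long, nodeData() As Byte, _
               bestSoFar As Long, nodesInRam As Long, spillFile As Integer)
    If nodeMin >= bestSoFar Then
        ' Bound test: this node can never beat the best solution found, so drop it.
    ElseIf nodesInRam < MAX_IN_RAM Then
        ' Keep the node in the in-memory array (not shown here).
        nodesInRam = nodesInRam + 1
    Else
        ' RAM is full: append the node's raw bytes to a file opened For Binary.
        Put #spillFile, , nodeData
    End If
End Sub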

 
Answer #3    Answered By: Glenda Frazier     Answered On: Aug 11

Why not just re-create the nodes when you need them?

If there are theoretically 500 million nodes, but you are only using some of
them, then it sounds like you are creating nodes unnecessarily. Why not only
create things when you need them? Or indeed, revisit your algorithm totally and
eliminate the nodes entirely, or only generate some figures and interpolate, or
whatever.

Production of 500 million reference items is not a common thing to do -
especially as you don't then use them all. Much more to the point, it is not
normal to do it in a semi-interpreted environment such as Excel VBA.

Try to eliminate your problem rather than trying to shoe-horn a solution into a
system that is not set up for it.

 
Answer #4    Answered By: Vilmos Fischer     Answered On: Aug 11

To date, every combinatorial algorithm developed by the hundreds of
researchers in this area, including mine, takes five times or more the
computational time on such problems (say 5 * 3 = 15 hours or more for my
algorithm, instead of 3 hours) if we do not create the extra nodes.

And as I said earlier, 450 million nodes are eliminated immediately at
creation, so only 50 million nodes are processed, and at any
intermediate stage I do not have more than 10 million nodes. Is there a
fast way to write the data for these 8 million nodes to a binary file or
similar?
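One thing I have been considering is to open a file For Binary and dump a whole Byte array with a single Put, then read it back later with a single Get. Would this be sensible? A sketch (the file name and sizes are only placeholders):

Sub DumpBlockSketch()
    Dim nodes() As Byte
    Dim f As Integer

    ' The block of rows currently held in RAM, about 200 MB of data.
    ReDim nodes(1 To 2000000, 1 To 100)

    f = FreeFile
    Open "C:\temp\spill.bin" For Binary Access Write As #f
    Put #f, , nodes        ' one call writes the whole array's raw bytes
    Close #f

    f = FreeFile
    Open "C:\temp\spill.bin" For Binary Access Read As #f
    Get #f, , nodes        ' one call reads the block back into the array
    Close #f
End Sub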

 



