anonymous
If, in C++, I were to create a large number of very large arrays and never delete them with delete[] (but be sure there are no memory leaks), would there be any negative consequences? That is, would I possibly run out of memory on the stack and get segfaults if I were to keep making huge arrays? Thanks!
Computer Science
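A minimal sketch to make the stack-vs-heap part of the question concrete (the size is arbitrary, picked only for illustration): arrays created with new[] live on the heap, so what you can exhaust is heap/address space, while it is large local arrays that can overflow the stack and segfault.

```cpp
#include <cstddef>

int main() {
    const std::size_t n = 10'000'000;   // ~40 MB of ints, size chosen for illustration

    // int on_stack[n];                 // a local array this big would likely
                                        // overflow the stack and segfault

    int* on_heap = new int[n];          // same size on the heap: fine, limited only
                                        // by available memory / address space
    on_heap[0] = 42;                    // touch it so the allocation isn't optimized away

    // Intentionally no delete[] here, as in the question. The OS reclaims the
    // memory at process exit, but while the program runs that memory stays in use.
    return 0;
}
```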
osanseviero
I think you would... and it would become really slow before that... but it depends on how big the arrays are.
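To see roughly what "running out" looks like, here is a small sketch (the chunk size is an arbitrary choice): it keeps allocating with new[] and never deletes, and stops when new[] throws std::bad_alloc. Caveat: on systems that overcommit memory (e.g. default Linux settings), the allocations may appear to succeed and the process can instead be killed later when the pages are actually touched.

```cpp
#include <cstddef>
#include <iostream>
#include <new>
#include <vector>

int main() {
    const std::size_t chunk = 256 * 1024 * 1024;  // 256 MB per array (arbitrary)
    std::vector<char*> blocks;                    // keep the pointers so nothing is "lost"

    try {
        while (true) {
            blocks.push_back(new char[chunk]);    // never delete[]d, as in the question
            std::cout << "allocated " << blocks.size() * 256 << " MB so far\n";
        }
    } catch (const std::bad_alloc&) {
        std::cout << "new[] threw std::bad_alloc after "
                  << blocks.size() * 256 << " MB\n";
    }
    return 0;
}
```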
anonymous
Don't think as much about the entire machine as about the processor. When a class, its data, and the function currently executing all fit within the CPU cache, you get an automatic boost in speed. When you allocate large numbers of large arrays, most accesses fall outside the CPU's instruction and data caches. That may be unavoidable if you are multithreaded anyhow, but if your goal is speed you want small, usable chunks rather than big allocations.

One other issue is that when you reach the 2 GB boundary of a 32-bit program, or the 4 GB boundary of a 32-bit program running on 64-bit Windows, you risk a crash of the software (segfault). Well before that limit, you also risk having your memory paged out to disk so other software can run on the host machine. Smaller allocations mean less paging to disk (virtual memory) and faster operation.
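A rough sketch of the cache point above, with made-up sizes: the same data is summed twice, once walking memory sequentially (cache- and prefetcher-friendly) and once jumping in large strides so most accesses miss the cache. On typical hardware the strided pass is noticeably slower even though it does the same amount of arithmetic; exact numbers depend on the machine.

```cpp
#include <chrono>
#include <cstddef>
#include <iostream>
#include <vector>

int main() {
    const std::size_t n = 1u << 24;          // 16M ints, ~64 MB (illustrative size)
    std::vector<int> data(n, 1);

    // Sum every element of `data`, visiting indices with the given stride.
    auto time_sum = [&](std::size_t stride) {
        long long sum = 0;
        auto start = std::chrono::steady_clock::now();
        for (std::size_t offset = 0; offset < stride; ++offset)
            for (std::size_t i = offset; i < n; i += stride)
                sum += data[i];
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                      std::chrono::steady_clock::now() - start).count();
        std::cout << "stride " << stride << ": " << ms << " ms (sum = " << sum << ")\n";
    };

    time_sum(1);     // sequential walk: the cache and hardware prefetcher help
    time_sum(4096);  // 16 KB jumps: most accesses are cache misses
    return 0;
}
```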
stormfire1
Nice answer...me likey.
