char century [100][365][24][60][60];
declares an array with a char element for each second in a century; that is more than 3 billion chars. So this declaration would consume more than 3 gigabytes of memory!
COOL! Just thought I'd share it with you. :D
http://www.cplusplus.com/doc/tutorial/arrays.html
Right, the amount of memory can be computed by multiplying the sizes of all the dimensions together, and then multiplying by the number of bytes required for the data type. In this example, you used a char, a typical one-byte type.
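Something like this quick sketch (my own, not from the tutorial) does that multiplication explicitly:

#include <iostream>

int main()
{
    // Multiply the sizes of all five dimensions, then by the element size.
    // sizeof(char) is 1 by definition, but it's spelled out for clarity.
    unsigned long long bytes = 100ULL * 365 * 24 * 60 * 60 * sizeof(char);
    std::cout << bytes << " bytes\n";  // prints 3153600000
    return 0;
}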
Well, not quite 3 gigabytes. Nearly 2.94 GB by my calculations*. Still, that is a bit atrocious, and the problem gets even worse when the element type is bigger than char; imagine using a 4-byte int, or a class/struct of 10 bytes or more.
*
Number of bytes: 100 * 365 = 36500; 36500 * 24 = 876000; 876000 * 3600 (60*60) = 3153600000.
Number of bytes in a gigabyte: 1024 * 1024 * 1024 = 1024^3 = 1073741824.
Number of gigabytes in the array: 3153600000 / 1073741824 = 2.937 (a bit nitpicky, eh? :P)
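For what it's worth, a 64-bit compiler can confirm that figure without ever allocating the array, since sizeof is evaluated at compile time (again, a snippet of my own, not from the original post):

#include <iostream>

int main()
{
    // sizeof on the array type alone; nothing is actually allocated.
    std::cout << sizeof(char[100][365][24][60][60]) << " bytes\n";  // 3153600000
    std::cout << sizeof(char[100][365][24][60][60]) / (1024.0 * 1024 * 1024)
              << " GB\n";                                           // ~2.937
    return 0;
}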
Anyways, what's more interesting is how changing just one of those values impacts the size overall. As an example, bump the first dimension from 100 to 101:
char century [101][365][24][60][60];
Which results in an increase of 31536000 bytes (365 * 24 * 3600), or roughly 30 MB.
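And a little sketch of mine that prints that delta directly:

#include <iostream>

int main()
{
    // One extra "year" in the first dimension adds 365*24*60*60 bytes.
    unsigned long long before = sizeof(char[100][365][24][60][60]);
    unsigned long long after  = sizeof(char[101][365][24][60][60]);
    std::cout << after - before << " bytes\n";  // prints 31536000 (~30 MB)
    return 0;
}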
Hiead