I am trying to quantify my computer's performance in a more visceral way. My processor, for example, runs at 2.8 GHz across 6 cores, meaning each core completes 2,800,000,000 clock cycles per second - very roughly one instruction each - for a total of about 16,800,000,000 instructions per second. Essentially, I am trying to put familiar CPU specs on a human scale.
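The headline arithmetic, for anyone who wants to plug in their own specs (a toy upper bound - real instructions-per-cycle varies wildly by workload and core):

```python
clock_hz = 2_800_000_000   # 2.8 GHz
cores = 6

# Naive upper bound: one instruction per cycle per core.
per_second = clock_hz * cores
print(f"{per_second:,} instructions/second")  # 16,800,000,000
```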

The difficulty I'm having is in quantifying memory. For background, a byte consists of eight bits, each of which stores either a 1 or a 0. We can describe a computer's total memory in bytes - in my case, my system reports 32,694,568 KB, or roughly 32 GB. Since a byte holds eight boolean values, it can store 2^8 = 256 unique combinations - in other words, any number between 0 and 255. **What is the largest possible value one can store in 32,694,568 KB?** I know the formula is going to be 2^{8*1024*32,694,568}, so I typed this into my terminal:

```
python3 -c "print(2**(8*1024*32694568))"
```
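For scale, you can estimate how big this number is without ever computing it (a quick sketch; it assumes the RAM figure is 32,694,568 KB, i.e. roughly 32 GB):

```python
import math

# Bits available, assuming 32,694,568 KB of RAM (about 32 GB).
n_bits = 8 * 1024 * 32_694_568

# The integer 2**n_bits itself needs n_bits of storage: essentially all of RAM.
print(f"raw size: {n_bits // 8 / 1e9:.1f} GB")  # 33.5 GB

# Its decimal expansion has floor(n * log10(2)) + 1 digits:
digits = math.floor(n_bits * math.log10(2)) + 1
print(f"decimal digits: {digits:,}")  # ~80.6 billion digits, ~80 GB as text
```

So even if the number fit in memory, printing it means emitting on the order of 80 GB of digits - which is why the naive approach stalls.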

And as my computer began to heat up, I realized it would literally take every byte of RAM I had available just to hold this number. WolframAlpha was also unable to provide a sufficiently precise estimate (can't say I blame them, as it would require 32 GB on their server **without an intelligent computation strategy**). A colleague of mine recommended I look into streaming the calculation - computing the number in chunks, or emitting its digits as a stream. Unfortunately, I do not know how to do this.
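In the chunked spirit your colleague describes, the two ends of the number can be recovered without ever materializing the whole thing - a sketch using only the standard library (again assuming 32,694,568 KB of RAM; three-argument `pow` does modular exponentiation, and `decimal` supplies the high-precision logarithm):

```python
from decimal import Decimal, getcontext

# Bits available, assuming 32,694,568 KB of RAM (about 32 GB).
n_bits = 8 * 1024 * 32_694_568

# Last k digits: modular exponentiation never builds the full number.
k = 20
last_digits = pow(2, n_bits, 10**k)
print(f"last {k} digits: {last_digits:0{k}d}")

# First digits: 2**n = 10**(n * log10(2)); the fractional part of that
# exponent determines the leading digits.
getcontext().prec = 60                    # more precision than we need
exponent = n_bits * Decimal(2).log10()
frac = exponent - int(exponent)
first_digits = Decimal(10) ** frac        # a value in [1, 10)
print(f"first digits: {str(first_digits)[:12]}...")
```

Streaming the *middle* digits is much harder - there is no cheap formula for arbitrary decimal digits of 2^n - so a full printout still requires building the number (or repeated divmod passes), but this gets you the recognizable ends instantly.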

While I do want to print that ridiculous number out on my screen, at this point I'm actually more interested in how I would go about doing this. I am a software developer, so if anyone knows an algorithm for calculating numbers of this scale in chunks/streams, I would love to learn it. Also, if you have more than 32 GB of RAM and can do the brute-force calculation, I would love to know how long it took your machine, which can be measured by prefixing the previous command with `time`:

```
time python3 -c "print(2**(8*1024*32694568))"
```

(On Python 3.11+ you would also need `-X int_max_str_digits=0`, since recent versions refuse to convert very large integers to decimal strings by default.)

I appreciate any help, thanks for reading!