## Langton’s Ant

###### By Anupum Pant

Think of a cell-sized ant sitting on a huge grid of white cells. The thing to note about this ant is that it follows a set of simple rules. The main rule is that when the ant exits a cell, it inverts the colour of the cell it just left. Besides that:

1. If the ant enters a white cell, it turns left.
2. If it enters a black cell, it turns right.

Here’s what happens if the ant starts out in the middle and moves to the cell on its right as a first step (the starting direction can be any of the four sides).

Now, as this continues, a seemingly random figure starts taking shape. The black cells are in total chaos; there seems to be no specific order to how they appear on the canvas. (Of course, the pattern is always the same chaos, given that the ant starts on a blank array of cells.)

And yet, after about 10,000 steps, the ant starts creating a very orderly highway-like figure on the canvas. It enters an endless loop of 104 steps which repeats forever and builds a long highway-like structure.
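The rules above are easy to turn into a small simulation. Here's a minimal sketch in Python, following the article's convention (turn left on white, right on black, invert the colour on leaving); the function and variable names are my own:

```python
def langtons_ant(steps):
    """Simulate Langton's ant on an unbounded grid of white cells.

    Returns the set of black cells and the ant's position after
    each step.
    """
    black = set()      # cells currently black; every other cell is white
    x, y = 0, 0        # the ant starts in the middle of the grid
    dx, dy = 1, 0      # first step is to the cell on the right
    trail = []
    for _ in range(steps):
        if (x, y) in black:
            dx, dy = dy, -dx        # black cell: turn right
            black.discard((x, y))   # invert to white on leaving
        else:
            dx, dy = -dy, dx        # white cell: turn left
            black.add((x, y))       # invert to black on leaving
        x, y = x + dx, y + dy       # move one cell forward
        trail.append((x, y))
    return black, trail
```

One way to detect the highway programmatically: once the 104-step loop sets in, the ant's displacement over any 104 consecutive steps is the same fixed vector, step after step.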

Now suppose you start with a configuration of black spots on the canvas instead of a blank white one. Take an array of cells with randomly arranged black spots, for instance. Given enough time, the ant still ultimately ends up making the looped highway. However, it may take significantly more, or fewer, steps to reach the loop than the ~10,000 it takes on a blank array of cells.

No exception has ever been found. Computer scientist Chris Langton discovered this behaviour in 1986.

## Scientifically, Do Retina Displays Make Sense?

###### By Anupum Pant

Our eye doesn’t work like a camera, with pixels and frame rates. It moves rapidly in small amounts and continuously updates the image to “paint” in the detail. And since we have two eyes, the brain combines both signals to increase the resolution further. As a result, the brain can build an image of much higher resolution than the eye’s raw abilities would allow. The very fact that we haven’t been able to come up with artificial devices that work the way a human eye does suggests we haven’t completely understood this complex device yet.

But what we do know about the average human eye is that its ability to distinguish between two points is measured at around 20 arcseconds. That means two points need to subtend an angle of at least roughly 0.0056 degrees to be distinguished by the human eye. Points lying any closer than that appear to the eye as a single point.

Note that if an object subtends 0.0056 degrees when it lies one foot away, it subtends a smaller angle as it moves farther away. This is why you have to bring tiny text closer in order to read it: bringing it closer increases the angle it subtends, and only then can the eye resolve individual letters. Or in other words, anything is sharp enough if it is far enough.
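This geometry is easy to compute. Here's a sketch in Python, with the acuity figure as a parameter since the 20-arcsecond number quoted above is just one estimate (real eyes vary):

```python
import math

def retina_ppi(distance_in, acuity_arcsec=20):
    """PPI beyond which two adjacent pixels subtend less than the eye's
    resolving angle at the given viewing distance (in inches)."""
    theta = math.radians(acuity_arcsec / 3600)     # resolving angle, radians
    pitch = 2 * distance_in * math.tan(theta / 2)  # smallest resolvable pixel pitch
    return 1 / pitch
```

With the 20-arcsecond eye at 15 inches this gives roughly 690 PPI, in the same ballpark as the ~710 PPI derived later in the article. With the more conventional one-arcminute (60 arcsecond) acuity at 12 inches it gives about 286 PPI, which appears to be where the "magic number" of 300 comes from.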

### Apple Science

Apple’s flagship Retina display is said to be so sharp that the human eye is unable to distinguish individual pixels at a typical viewing distance. As Steve Jobs said:

> It turns out there’s a magic number right around 300 pixels per inch, that when you hold something around to 10 to 12 inches away from your eyes, is the limit of the human retina to differentiate the pixels.

Given a large enough viewing distance, all displays eventually become retina.

Basically, Apple has done its science at home and come up with a nice number: 300 PPI. Practically, you don’t need anything higher than that. Technically, you do.

### Isn’t “more” better?

No one is really sure. According to my calculations, an iPhone 5s’s display (3.5 × 2 in) subtends about 13.3° × 7.6° from a 15-inch distance. With the kind of resolving power our eye sports, you’d need that small screen to display around 4 megapixels. In layman’s terms, you’d need a screen that packs around 710 PPI; practically, that sounds a bit too extreme (or maybe my calculations are wrong, please point it out in the comments). I’d go with Steve Jobs’s calculation.

### My shitty screen is a retina display

So, technically, any device can be said to be sporting the most touted screen in the industry today, a retina display, if it is kept at a sufficient distance. For instance, my laptop’s monitor, with about a quarter of the resolution (~110 PPI) of today’s flagship devices, becomes a retina display when I use it from a distance of about 80 cm. That also happens to be the distance I normally use my laptop from, and even doctors consider 50-70 cm an optimum screen-to-eye distance for avoiding eye strain.

On my shitty screen, the pixels sit 0.23 mm apart, centre to centre. And at 80 cm, my eye is practically unable to see the difference between a retina display and a shitty one. So I ask: do you really need higher and higher PPI devices? But that is just my opinion.

### My Shitty phone is a retina display

As phones are generally used from a much closer distance, they require a higher PPI for the screen to look crisp. My phone, a Lumia 520, has a 233 PPI screen. It becomes a retina display at any distance beyond about 15 inches. So I’d have to hold my phone about 4 inches farther away than an iPhone to turn it into a display that is as good as an iPhone’s. Do I bring my phone any closer for anything? No. Do I need a higher PPI? No.
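The same geometry can be inverted to get the distance at which a given screen "becomes" retina. A sketch, assuming the one-arcminute acuity criterion that underlies the 300-PPI figure (the PPI values are the ones quoted above):

```python
import math

def retina_distance_in(ppi, acuity_arcsec=60):
    """Viewing distance (inches) beyond which pixels at this PPI can no
    longer be resolved, for a given visual acuity in arcseconds."""
    theta = math.radians(acuity_arcsec / 3600)  # angle per pixel pitch
    return 1 / (ppi * theta)

laptop = retina_distance_in(110)   # ~31 in, i.e. about 80 cm
lumia = retina_distance_in(233)    # ~15 in
```

Both figures line up with the distances quoted in the article: a ~110 PPI laptop screen turns retina at roughly 80 cm, and a 233 PPI Lumia 520 at roughly 15 inches.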

### Conclusion

Recent phones from Samsung, Nokia and HTC pack in 316, 332 and 440 PPI or more. Companies are spending billions to decrease the distance between their pixels; Sony, for instance, has recently come up with a 440 PPI display. And now we have 4K TVs. Practically, I’d say: put an end to this manufacturer pissing contest and use the money for something more worthwhile. Technically, going by the calculations, we still have far more complicated technologies to develop before displays can cram in enough pixels to fully please the human eye.

## Harnessing The Power of Nature – Biological Data Storage

### The present storage technology

Storage technology has come a long way since 1956, when IBM started pushing it. The journey began with data storage densities as low as 40 bits per square inch (the RAMAC 350, 1956). The effort brought great results: by the year 2000, IBM had set a density record of 14.3 billion bits per square inch.

Today, in the year 2013, most HDDs (Hard Disk Drives) store data at densities of around 500 billion bits per square inch; technology at this level has brought terabyte-sized HDDs to our computers. Research on pushing data densities higher is still a bustling area, which is why we see breath-taking new storage technologies in the news almost every month.

### Latest Stories

Just a few months back, using a technique called nanopatterning, a team from Singapore demonstrated 3,300 billion bits per square inch. That is about 6.6 times the density of a normal HDD, meaning a present-day 1 TB drive could hold more than 6 TB if this reached manufacturing units.

Seagate, in another story, has promised data densities of the order of 1 TB per square inch (8,000 billion bits per square inch) within the next decade, which would enable hard drives of up to 60 TB in capacity.
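The scaling in these claims is simple arithmetic; here's a quick sanity check using the density figures quoted above (the variable names are mine):

```python
# Areal densities in billions of bits (Gbit) per square inch, as quoted.
BASELINE_GBIT_IN2 = 500       # a typical HDD in 2013
NANOPATTERN_GBIT_IN2 = 3300   # the Singapore team's demonstration
SEAGATE_GBIT_IN2 = 8000       # Seagate's 1 TB-per-square-inch target

nano_scale = NANOPATTERN_GBIT_IN2 / BASELINE_GBIT_IN2  # 6.6x today's density
seagate_scale = SEAGATE_GBIT_IN2 / BASELINE_GBIT_IN2   # 16x today's density

print(f"nanopatterning: {nano_scale}x, so a 1 TB drive becomes ~{nano_scale} TB")
print(f"Seagate target: {seagate_scale}x")
```

The 16x Seagate figure squares with the 60 TB claim if you assume a multi-platter drive of a few terabytes as the baseline.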

A similar thing has happened with disks: from CDs to DVDs to dual-layer DVDs to Blu-rays, alongside several storage formats that didn’t last, from Zip drives to holographic storage. Data storage densities have improved dramatically.

### Is it enough?

Although our present ability to store a lot of data in small physical spaces is enough for now, meeting future demands will require us to keep progressing at an unbelievable rate. The fact that physical storage is gradually approaching its limits could bottleneck our progress in the future.

### Biological Storage Devices

The exact storage mechanisms used by amazing natural systems like the human brain and DNA have remained elusive for decades. To keep up with the rapid pace of development, it is important that we step up our work in this area. I think the answer to our demands lies with nature.

A brain, for instance, is estimated to be able to store something close to 2.5 petabytes (about 2.5 million gigabytes). The sad part: we don’t exactly know how it stores this. Moreover, we don’t even know how to precisely calculate its storage limits; these estimates are just theoretical. We still have a long way to go.

### The greatest storage device

Recent successful experiments with storing and retrieving data in DNA have brought new hope for the future. Teams at the European Bioinformatics Institute and Harvard University have successfully stored famous speeches, photos, and entire books in synthetic DNA, and then retrieved them with 99.99% accuracy.

Being able to store data in DNA would confer three advantages on us. Firstly, it will be fast (very), yes, faster than the flash drive. Secondly, it won’t degrade with age (DNA can stay readable for around 10,000 years), at least not like HDDs, which have moving parts. Finally, DNA will let us reach data densities of unimaginable levels. Imagine being able to store half a million DVDs in a single gram of DNA! Technically that amounts to about 2.2 petabytes per gram, the highest density demonstrated so far; the Harvard experiment reached around 700 terabytes per gram (measuring in area is difficult for an entity like this).
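The half-a-million-DVDs claim checks out with simple arithmetic, assuming a single-layer 4.7 GB DVD:

```python
DVD_GB = 4.7            # single-layer DVD capacity (assumed)
DNA_PB_PER_GRAM = 2.2   # the highest DNA storage density quoted above

# petabytes -> gigabytes, then divide by one DVD's capacity
dvds_per_gram = DNA_PB_PER_GRAM * 1e6 / DVD_GB
print(round(dvds_per_gram))  # roughly 468,000, i.e. about half a million
```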

Bring DNA drives to our PCs I say!