Big data is a misnomer; complex data is a better description. A terabyte of simple two-column data is really not that difficult to analyze and won't give you much information, whereas a few hundred MB of data with complex relationships and many dimensions can yield tons of information and is far harder to analyze.
Difficulty in "big data" should be about its horizontal breadth (covering many aspects of a system) rather than its vertical depth (covering one aspect of a system in great resolution).
The devil is in the details. Big Data is really a massive cluster of VMs running maxed-out Excel spreadsheets, instrumented to restart automatically and restore from redundant backups, a la RAID, when the Excel process crashes one of the Windows VMs.
The current limit is 1,048,576 rows per worksheet. However, there is a tool called PowerPivot that lets you get around that limit and do analysis on larger data sets.
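For scale, here's a rough sketch (in Python with pandas as a stand-in, not PowerPivot's actual interface, and with made-up numbers) of a dataset that blows past a single worksheet but is still trivial for any columnar/pivot engine to summarize:

    import numpy as np
    import pandas as pd

    # Excel's per-worksheet row limit (2^20 rows since Excel 2007)
    EXCEL_ROW_LIMIT = 1_048_576

    # Hypothetical dataset: 3 million rows, too big for one sheet
    n_rows = 3_000_000
    df = pd.DataFrame({
        "region": np.random.choice(["north", "south", "east", "west"], size=n_rows),
        "sales": np.random.exponential(scale=100.0, size=n_rows),
    })

    print(f"rows: {len(df):,}  (worksheet limit: {EXCEL_ROW_LIMIT:,})")

    # the kind of summary a pivot/aggregation engine hands back
    print(df.groupby("region")["sales"].agg(["count", "sum", "mean"]))

The point isn't the tool, it's that the row count alone is not what makes the analysis hard.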
He's saying his file has way more rows than that, so maybe they upped the limit in more recent versions of Excel? (I think he also wrote a bunch of VBA and hooked into some external systems too.)
"sometimes I think Big Data is just Excel on 128GB of RAM"