To those saying DNA sequencing: although the cost of performing the actual sequencing has dropped, there is still a ton of overhead cost in acquiring and managing any genomic data. Context: product platform guy at a biobank.
I mean, I suppose that depends on how you're defining "overhead" cost? To sequence a human genome, you _need_ to purchase a ~$700 reagent kit from Illumina to plop into your HiSeq and a ~$300 library prep kit -- are these overhead? I wouldn't call them overhead, and that fixed cost hasn't really changed in 5+ years.
Those two costs are what make up the $1000/human-genome stat; we don't factor in the labor or the other consumables.
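For what it's worth, the arithmetic behind that stat is just the two kit prices from the comment above -- a rough sketch, treating both list prices as approximate:

```python
# Back-of-envelope for the "$1000 genome" stat, using the two kit
# prices quoted above (approximate list prices, consumables only).
reagent_kit = 700    # ~$700 Illumina sequencing reagent kit
library_prep = 300   # ~$300 library prep kit

print(f"consumables per genome: ${reagent_kit + library_prep}")  # $1000
# Labor, instrument depreciation, and compute/storage are deliberately
# excluded -- which is exactly the point being made above.
```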
The exponential part is what's going to trip up most of this. In the best case you might see exponential declines for a very short window in one segment, followed by a continued but very slow decline in costs.
Space launch costs, thanks initially to SpaceX pressure.
Wireless and wired broadband costs per bit continue to decline persistently worldwide. Satellite broadband costs are about to take a hard one-time drop as Starlink comes online (another example of how short-lived the biggest drops are).
The cost of entry for a decent virtual reality experience, relative to the value derived from it. The gap between the hardware requirements and the median machine has narrowed massively in the Oculus era, and the quality of the software will keep improving steadily over a long period.
Retail internet prices have been dropping significantly, except where monopolies are in control. In the EU you can get a gigabit connection for 20-30 EUR/month, which seemed impossible even just 10 years ago.
Ehh, not really for any appreciable amount of sequencing; we've been pegged right around ~$1000/human genome for a while now [1]. The machine mentioned in the NYT article is almost certainly a variant of Thermo's RapidHIT system [2], which is not DNA sequencing.
Interesting fact: the price of a bushel basket of wheat has been approximately equal to a single British pound sterling since the 1300s. At that time the pound was only slightly debased: it was worth only slightly less than a pound of silver.
I'm not sure. I did some analysis comparing the cost of basic grocery items in the early 20th century to now, and they aren't that different. I normalized by the price of gold then vs. now, and by the average daily wage, to get a better feel for relative prices.
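To make the method concrete, this is the kind of normalization I mean; every number below is a rough placeholder, not data from the actual analysis:

```python
# Sketch of the normalization: express an item's price in days of
# wages and in grams of gold, then compare eras. All inputs are
# made-up placeholders for illustration only.
def relative_price(item_price, daily_wage, gold_per_gram):
    return {
        "days_of_wages": round(item_price / daily_wage, 3),
        "grams_of_gold": round(item_price / gold_per_gram, 3),
    }

print(relative_price(0.05, 2.00, 0.66))    # a ~1910 loaf of bread (assumed prices)
print(relative_price(3.00, 160.00, 60.0))  # a ~2020 loaf of bread (assumed prices)
```

Under these placeholder inputs the two eras come out in the same ballpark, which is the sense in which prices "aren't that different."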
Over hundreds or thousands of years, yes, definitely.
All that said, it seems like automation could push the cost of food to near zero in the coming decades.
Looking at this, it seems like in Elizabethan times, a penny was more or less equivalent to a dollar or pound today, if you look at the price of bread, beef, and wine.
TVs are still getting cheaper, amazingly. I didn't think there was much lower they could go, but $500 buys you more and more TV every year -- and that's in nominal dollars, not counting inflation, so the effect is even more pronounced.
I'm assuming an 11 year old trading stocks has parents that have money to burn and receives it from them, so this wouldn't apply to typical 11 year olds.
The price of genome sequencing is no longer dropping exponentially. And when it was, the cost was heavily amortized across very expensive machines, which masked the true costs.
Let's just say that every analysis of sequencing costs I've seen has to play extensive trickery with the various cost components: the capital cost of acquiring the machine, the operational cost of the reagents, the staff running the machine, the servers required to store the data after it's acquired. Basically, the stated prices are "the operational cost of the reagents," with the cost of the machine subsidized out, amortized, or otherwise hidden from the calculation.
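As a sketch of what that hiding looks like, here's a toy cost model. None of these numbers are real -- the $1M machine price, throughput, staff cost, and storage cost are all assumptions -- they only illustrate how the accounting choices move the headline figure:

```python
# Toy cost model with made-up numbers, showing how utilization and
# amortization choices move the "per genome" cost.
def cost_per_genome(machine_cost=1_000_000, life_years=4,
                    genomes_per_year=2_000, reagents=1_000,
                    staff_per_year=100_000, storage_per_genome=25):
    capital = machine_cost / (life_years * genomes_per_year)  # amortized machine
    labor = staff_per_year / genomes_per_year                 # staff share
    return reagents + capital + labor + storage_per_genome

print(cost_per_genome())                         # 1200.0 at low utilization
print(cost_per_genome(genomes_per_year=10_000))  # 1060.0 -- capital nearly vanishes
```

Quote only the `reagents` term and you get the headline number; the rest depends entirely on how busy you keep the machine.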
Many of the predictions about exponential genomics were made during the few years when those misleading rates were being published. I spent a bunch of time trying to figure out how to engineer systems to deal with exabytes of genomics data, but there was no reason to: there isn't that much data being produced, retained, or needing to live on spinning platters of rust. Just another example of bio hype, unfortunately.
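A quick sizing exercise shows why the exabyte projections never materialized; both inputs below are assumptions, not measured figures:

```python
# Why "exabytes" never shows up: a back-of-envelope with assumed inputs.
genomes_per_year = 1_000_000  # assumed worldwide clinical/research throughput
gb_per_genome = 100           # assumed ~30x WGS raw FASTQ; CRAM is far smaller

petabytes = genomes_per_year * gb_per_genome / 1e6  # GB -> PB
print(f"~{petabytes:.0f} PB/year")  # ~100 PB/year, i.e. 0.1 EB, spread worldwide
```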
> -hosting (became free in last couple of years thanks to Netlify, GH pages etc.)
Uh, GitHub really wasn't the first to offer free hosting, and it's definitely not a "last couple of years" thing. GeoCities was doing that back in 1995, and I don't know if even they were the first. Moreover, GH Pages is super limited; others let you run dynamic scripts (PHP being the most popular). I'd never heard of Netlify, but Wikipedia tells me it's standard shared hosting for static content.
As for "cloud" storage, that's getting cheaper in the sense that providers are reducing their profit margins (getting closer to charging actual cost), but if you roll your own, you can still do it for less than what the cheapest third party charges. And those underlying costs are also dropping year over year. With both effects combined it's dropping quite rapidly, but for two separate reasons, not just because of advances in hard drive tech.
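To put rough numbers on the two effects -- both prices below are assumptions in the ballpark of circa-2020 list/retail prices:

```python
# Rough $/GB/year for cloud object storage vs. roll-your-own drives.
cloud = 0.023 * 12   # assumed ~$0.023/GB/month object storage -> ~$0.28/GB/year
diy = 20 / 1000 / 5  # assumed ~$20/TB drives over a 5-year life -> ~$0.004/GB/year

print(f"cloud: ${cloud:.3f}/GB/year, DIY: ${diy:.4f}/GB/year")
# DIY here is drive cost only -- power, chassis, replication, and ops
# time close some (but not all) of that ~70x gap.
```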
It's static shared web hosting, with a few cool features like a built-in CI, server-side analytics and serverless functions. They also advertise worldwide high availability.
I don't think they ever hit parity with desktops, mainly due to thermal constraints. There's no way a 15W laptop CPU can come close to a 95W desktop CPU in anything other than short-burst (<15s) single-threaded performance.
There are probably some limits to the cheapness. I think hard drives will always cost at least ~$35 no matter how small the capacity, just because of the parts and manufacturing.
Probably the clearest chart explaining why machine learning is blowing up now, despite the same neural network architectures having existed since the 1990s (e.g., LeNet-5).
It's because cost dropped from $47,000 per GFLOPS to $0.02!!!
That $0.02 value is entirely bogus. It is based on the published 30 "teraops" for Xavier, which it treats as teraflops and assumes is comparable to the other FLOPS values ($699 / 30 "TFLOPS" = $0.02/"GFLOPS"). "Teraops" counts int8 performance on the GPU plus the random vision and CNN ASICs in Xavier -- it is definitely not equivalent to normal fp32 FLOPS.
It looks like the GPU in Xavier can do 1.4 TFLOPS single precision, so the correct value is $0.49.
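For reference, the two divisions side by side, using only the figures from the comments above:

```python
# The bogus and the corrected $/GFLOPS for the $699 Jetson AGX Xavier.
price = 699.0            # dev kit price, USD

marketing_gops = 30_000  # 30 "TOPS" (int8 + vision/CNN ASICs), as GOPS
fp32_gflops = 1_400      # ~1.4 TFLOPS fp32 on the GPU alone, as GFLOPS

print(f"${price / marketing_gops:.3f}/GFLOPS")  # $0.023 -- the bogus chart value
print(f"${price / fp32_gflops:.3f}/GFLOPS")     # $0.499 -- the ~$0.49 fp32 figure
```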