notTheAuth's comments | Hacker News

Pretty sure LESS of the behavior that feeds industrial feedback loops is the best climate-change solution.

Nuclear power probably seems pretty great to the WSJ and capitalist readers who want to own it all and need the public to be OK with nuclear power again, though.


> Decentralization seems to work well for cases …

I skipped copying it all because it’s right there; not limiting my reply to that snippet.

Yeah, signal in an information network attenuates over longer distances (in space and time).

The internet created an ansible for assholes to hassle us.

Knowing this, fuck the open web. I want an email provider that just straight up blocks messages from senders I haven’t approved.

I'd like Signal to go the extra step and offer communication over a direct 1:1 link, maybe using WireGuard, both as a fallback and as a way to put other options in front of less savvy users.

Spam is just advertising by agents that won’t kowtow to government. I don’t need schemers in big biz or Africa.

Decentralization of communication will never happen by running some platform centered on rugged-individualist hackers' motives. But we don't exactly need FB, Twitter, etc., either.

Our phones are just TVs for business opportunities. I’ll just send my adventures straight to people who care and skip the noise thanks.


Xfinity provides internet while Comcast screws customers

Trickle down is a euphemism for peeing on people from the top

It’s all obfuscating semantic games to conceal their application of agency, not literally, but cognitively


Security will play a huge role in obsoleting software development as a job.

Monkeys in chairs papering over generic CPU design is pushing chip makers to consider silicon designed to workload spec: input the parameter set, let it go.

Chips are now undergoing their great decoupling, like software did. It'll take a while as the manufacturing process pivots, but rather than 8 generic cores we'll eventually have SoCs per application. Software will be pushed to the UI layer alone for users, plus whatever industry needs to bootstrap manufacturing.

Frankly I’m looking forward to it; I can’t think of anything software companies have provided humanity that will stand the test of time, except making us all learn their new preferences.


> pushing chip makers to consider silicon designed to workload spec

Well ... the big one of these is GPUs, which started as fixed-function pipelines and turned into massively parallel execution systems repurposed for cryptocurrency.

Various people have tried to do niche things such as neural net coprocessors. General-purpose or GPU-like or DSP-like systems tend to win in the market ... because they run software, which is more flexible.

There's a reason ARM are now up to 180 billion cores shipped; the easiest way of making a custom IC is to wrap it around an ARM core with some firmware. The code is burned into ROM (which has security advantages and disadvantages!), but to me it still counts as "software".

At some point the iPhone will cross into triple digits of CPU cores onboard (counting across the whole board, not just the SoC and radio). It will be rather difficult for anyone other than Apple to count when that is.


I'm going to rely on the insight of my friends in the chip biz working at Intel and Qualcomm back in the day: generic CPUs "won" because it became clear that manufacturing anything different was literally impractical 10+ years ago.

The goal of custom-to-spec silicon never went away; it just had to bide its time. Then Intel got taken over by MBAs.

Choices were made based on real data. The idea that generic CPUs "won" due to consumer choice, or that anything wins due to consumer choice, is just a handy political meme. We get what the supply side can sell for the most profit.


So the whole reason we have general-purpose CPUs is that in the '50s/'60s they were hulking giant beasts in both size and power consumption; computing was only economical on these devices if shared.

Things shrunk, and now everyone has a computer in their pocket, which in most cases is a window to the next generation of hulking giant beast: the "cloud".

So if I'm going to develop applications for the cloud instead of PCs or mainframes, the hardware is going to need to be general purpose, because it's shared and rented by the minute/CPU cycle. Just like the mainframes of old. So some notion of general-purposeness will always be there.

Maybe hardware will support and fossilize around programming languages? One can argue that x86 is doing that and is essentially an "SOC" for C.


It seems to me that "general purpose" was a big driver of PCs and phones.

PCs are good for word processing, games, scientific computing, tracking satellites, etc.

The appeal of a smartphone is that it replaces a dumb phone, a Walkman, a TV set, a video game console, a watch, and many other things.


General purpose won once it was realized that monolithic chips that did it all were infeasible with the manufacturing technology of the past.

Over time the political story about market winners took over, and technology development became mired in MBA bean counting to extract wealth, as it became clear the public would happily consume whatever was fed to them via PC screens, just as we did with TV.

None of these textual objects or CPUs will mean anything in the future. Manufacturing technology will evolve to provide "end to end" computing gadgets: black boxes that need user interface code at most, to contextualize outputs. We're subsidizing a process of technical evolution at scale.

Describing this all through the words of contemporary political discourse is missing the point.


There's an argument to be made that political austerity causes social segregation and leads people to die in poverty from preventable problems.

Why don't we set aside the rap battle and Mathematica concerns and focus on the well-known features of human society that are failing people?

Cut the crap with the pearl-clutching "there's an argument to be made that other people's behavior did…" and consider yours as a member of an implicitly caste-based society.

Treat yourself like the subject of study instead of abstractly focusing on the flock, dad.

Western psychotherapy role play is the worst personality trait.


How is this different from instigating demand for Lamborghinis and 30-room mansions?

Gaming people into doing things of nonsense utility is about all humans have to do after basic life-supporting logistics.

Seems like mining fake objects, while power consuming, is less literally damaging than rocket ships and garages full of cars, boats, and 5 mansions that never get used.

It's not outside the realm of possibility that pretend loot is what the masses will have to look forward to "owning," aside from the basics, down the line.


I think the main difference is that when buying a home or car, the person knows what they're getting and for how much money.

There is no intentional sunk cost fallacy or gambler's fallacy at play.

What you've posted would be valid if the topic of discussion were NFTs or skins where a person gets exactly what they pay for, no more or no less.

A gacha-style game operates differently from an NFT or car or house. It's not "If you pay $200 you get this digital item in the game", it's instead "If you pay $2, you get a 0.6% chance of getting the item." After you spend $100, the next chance is still 0.6%, and our human brains are really bad at realizing that. That's how it preys on us.
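
A quick back-of-the-envelope sketch of that point, in Python; the 0.6% rate and $2-per-pull price are taken from the comment above, the rest is just independent-trial arithmetic:

  # Odds of landing a 0.6%-drop item after n independent pulls at $2 each.
  # No pity mechanic is modeled here; every pull is the same 0.6%.
  rate = 0.006
  cost_per_pull = 2

  for n in (50, 100, 200, 500):
      p_at_least_one = 1 - (1 - rate) ** n
      print(f"{n:4d} pulls (${n * cost_per_pull:4d}): "
            f"{p_at_least_one:.0%} chance of at least one copy")

  # 100 pulls ($200) still only gets you to roughly a 45% chance,
  # even though each individual pull never stops being 0.6%.

The per-pull odds never improve with spend; only the cumulative odds creep up, and far more slowly than intuition suggests.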

If the game were instead "Pay $200 and you get this digital item", I would consider it less predatory, and I bet it would also have far less profit.


Genshin (and many other gacha games) is actually more like the second option, since it features a mechanic known as pity: after a certain number of pulls, the chance to get the item goes up to 100%. In Genshin that happens at 180 pulls, but because the odds of the featured item also ramp up as you keep pulling, realistically it happens at around 150-160 pulls. This is still crazy (especially if you buy all of those pulls with money, because the dollar-to-pull exchange rate is absolutely ludicrous), but it does mean that an F2P player can guarantee they will get a character they want after a couple of months of saving.
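
A rough simulation of how pity changes the picture; the 0.6% base rate and the 180-pull guarantee come from the comment above, while the "soft pity" ramp starting at pull 140 is a simplified, made-up stand-in for the game's real table:

  import random

  # Simplified pity model: flat 0.6% until pull 140, then the chance ramps
  # up by 5 percentage points per pull, with a guaranteed hit at pull 180.
  # The ramp numbers are illustrative, not Genshin's actual values.
  def pulls_until_hit(base=0.006, soft_start=140, hard=180):
      for n in range(1, hard + 1):
          p = base if n < soft_start else base + (n - soft_start) * 0.05
          if n == hard or random.random() < p:
              return n

  samples = [pulls_until_hit() for _ in range(100_000)]
  print("average pulls to hit:", sum(samples) / len(samples))

Under a model like this most runs resolve well before the 180-pull hard cap, which is the commenter's point, but the dollar cost of buying that many pulls outright is still enormous.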


Agreed. Just put your commercials on YT or Twitch, but always link to your wallet, whether it's Patreon or a t-shirt store elsewhere.

Decouple. Take advantage of network effects. Model income generation wide and deep.

It’s all about extending social geometry.


If Google were a bit more like Apple, all of these things wouldn't be allowed under YouTube's TOS.


Fair doesn’t mean the same.

I align with Apple a bit more as they actually do interesting manufacturing R&D; they’re all terrible on the software and privacy side.

Google is 3-4 useful websites, cloud software hype, and resource consumption.

We could write desktop software that recurses over personal data and abstracts useful metadata, and share that metadata with each other. Users pay for bandwidth when they could just utilize their computers better.
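
A minimal sketch of the kind of local-first metadata pass that idea implies, in Python; the folder and the fields collected are placeholders, not a real design:

  import json
  from pathlib import Path

  # Walk a local folder and abstract a little shareable metadata per file,
  # instead of shipping the raw data off to a cloud service.
  # "Documents" and the chosen fields are illustrative placeholders.
  index = []
  for path in Path.home().joinpath("Documents").rglob("*"):
      if path.is_file():
          stat = path.stat()
          index.append({
              "name": path.name,
              "suffix": path.suffix,
              "size_bytes": stat.st_size,
              "modified": stat.st_mtime,
          })

  # The index, not the underlying files, is what would get shared.
  print(json.dumps(index[:5], indent=2))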

Somehow we've anchored our agency to doing that via cloud providers, who externalized their real costs onto startups, which is why they're rich.

I'm really hopeful the future of hardware comes with power savings and performance that make building a business with off-the-shelf parts tenable again. But who knows.


“Enthusiast” CPUs seem pointless. My CPU is never above 20% in AAA games as it’s all on the GFX card now.

Good to see CPUs going through the great decoupling that software did.

IMO the Steam Deck is the future of home desktops. Both my kids are into science; I'm excited to have a drone remote, sensor base station, generic PC, etc., in a high-quality package versus something like the PinePhone.

Valve and Apple are pushing hardware forward. Hopefully they can obsolete the need for data centers of generic CPUs and tons of Byzantine software by making hardware with the best logic for a task built in, available to home users.


Just a question about gaming. I haven't really seen any good AAA games worth playing anymore. The GPUs and CPUs have great capabilities, but I don't see any good games that make buying the hardware worth it. Most of the games I am interested in don't need good hardware. Do you feel the same trend when you play games?


I don’t know whether it applies to you, and don’t even know whether it’s true, but I think that may have less to do with new games being worse than with you being older and/or having seen more games. Getting older makes people less inclined to be obsessed with games, and having seen more games decreases the chance of a new game being an outlier, and outliers attract attention.

I think this applies to other fields, too. Watch your umpteenth Super Bowl, and chances are you will think back to the 'better' one you saw when you were young. Twenty years from now, the kids watching their first one now will say the same about this one.


I don't completely disagree, but the point is that games haven't really gotten better. Gameplay hasn't improved for many games (look at Cyberpunk 2077), there are more and more HD remakes since they aren't making new good games that excite people (SC2 was never as loved as SC1, same with Diablo 2 vs 3), and graphics have improved but gameplay has not. I think Nintendo is the most consistent with good new games, but that's not really relevant to PC gaming.


Well, Nintendo titles usually receive the best kind of HD upgrades on PC. Modders have even created a PC-only DLC for Breath of the Wild.


"Best HD upgrades" is just because the default is upscaled, older consoles like the Dreamcast had amazing HD PC upgrades too (probably since the default resolution was 640 x 480, the emulated BoTW on the Wii U was 720p). Very interesting DLC, is it just the WiiU verison or switch version?


Yeah I think that’s a side effect of knowing how the sausage is made.

I have written my own ECS loops and rendering pipelines, all naive, but after that it's optimizing for product fit, and product emotional themes are pretty copy-paste to satisfy social memes.


They still occasionally come out, but it's rare. I haven't been happy with an AAA game aside from Prey recently (2017 as "recent"), and Cyberpunk 2077 was all hype and no substance. I think they're running out of new interesting games (Paradox and Arkane are still good studios though), and many games I'm intrigued by are just remakes.

StarCraft Remastered, AoE II HD, System Shock, and the Halo collection, for instance; the Homeworld remake didn't even interest me since I heard it was worse in some ways with hit boxes. They also don't need new graphics cards. It's so different from when PC hardware upgrades and games were much more closely coupled.


Sony has started to port prestige first-party PlayStation games like Horizon Zero Dawn to PC. If you like that sort of thing it's worth checking out...


Feels like we've been at a "good enough" or "realistic enough" plateau in graphics for nearly a decade now.


I haven't been excited for graphics since Crysis in 2008; nothing after that was very impressive in comparison.


> My CPU is never above 20% in AAA games

That just means your GPU is the bottleneck, not that the CPU couldn't be utilized more.


Basically all of my favorite games get worse simulation performance (FPS or UPS) the longer you play, as the simulation becomes more complex (Factorio, RimWorld, Oxygen Not Included, Kerbal Space Program, Stellaris), so the performance is absolutely desired.


There are plenty of games that actually use CPU: Microsoft Flight Simulator, Factorio, Stellaris, Total War, pretty much any city simulator game, etc. Sure, your average dumb AAA action game won't, but that doesn't mean a good CPU is worthless.


Factorio isn't stressing 8-core CPUs. Stellaris can be played on a laptop. You are confirming his conclusion that they don't need a strong CPU to play.


lol! Stellaris can be played on a laptop, but try ramping the galaxy size up to 1000 and/or increasing the habitable planet multiplier. You get a couple hundred years in and the game just crawls, even on nice hardware. It's not unplayable, it's a strategy game, but the pace definitely slows down a lot, and space battles aren't as fun to watch.

By the same token you can play virtually any game on a cheap gaming rig. Just put all the graphics on low, run it at 720p and be happy with 20 fps.


Most games don't need the latest or greatest hardware to run well, and the lack of good AAA games makes the value proposition of new hardware much less appealing versus the days of wanting to build a computer to play Crysis.


Endgame Factorio stresses CPUs because rocket-per-minute bases are a thing.

1 RPM is where a mega base starts. Stronger players can do 20 RPM (yes, a rocket every 3 seconds).

In those conditions, your CPU becomes the limit on RPM as your game starts to slow down.


> Stellaris can be played on a laptop. You are confirming his conclusion that they don't need a strong CPU to play.

Movies can be watched on phones. Does that mean theater screens are pointless?


Just look at the attendance of movies or how often phones are used for videos.


With different software pipelines they could run right on a GPU.

It's all state in a machine, and ML is showing us that recursion + memory accomplish a lot; why all the generic structure in x86 if we can prove our substrate works just as well, with better power efficiency, when it's structured for a specific workload?

Chips aren’t concepts, they’re coupled to physics; simplify the real geometry. I think that’s what Apple is really proving with its chips, and why Intel is trying to become a foundry; they realize their culture can only extend x86 and x86 comes from another era of manufacturing.

I got into tech designing telecom hardware for mass production in the late '90s and early '00s. I just code now but still follow manufacturing, and have friends who work in fabs all over; this is just sort of a summary of the trends we see, shrug emoji.


Is running it all on the GPU a realistic goal? Nvidia wants ARM to make GPUs/CPUs together. The idea is as intriguing as making games that are OS-independent and just run on bare metal by targeting the ISA directly. I don't think there are games that do that.


If it's all on the GFX card, why is there a performance difference in games between Intel and AMD CPUs?


When testing games, CPU reviewers tend to test at reduced resolutions and quality settings with the highest-end GPU they have, as a means of highlighting the differences between the CPUs.

While there aren't any nefarious intentions on behalf of the reviewer, this approach runs into the following problems:

- People buying high-end GPUs are unlikely to be running at resolutions of 1080p or below (or at lower quality settings), and won't see as much (if any) performance difference between CPUs as what reviewers show.

- People buying lower-end GPUs are going to be GPU-bottlenecked, and won't see as much (if any) performance difference between CPUs as what reviewers show.

- Each frame being rendered needs to be set up, animated, sent to the GPU for display, etc., and like all workloads, there's going to be portions that can't be effectively parallelized. As such, the higher the frame rate, the more likely the game is to be bottlenecked by single-threaded performance, which is an area where Intel CPUs have traditionally been strong relative to AMD's. However, as frames get more complex and take longer to render, the CPU has more of an opportunity to perform that work in parallel, and raw computational throughput is an area where AMD's modern CPUs have been strong relative to Intel's. So just because a CPU has leading performance in games today, doesn't necessarily mean that will hold in the future as game worlds become more complex (and reviewers revisiting the performance of 2017-era AMD Zen 1 vs. Intel Kaby Lake in recently-released titles have already started seeing this).
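
A toy Amdahl's-law-style calculation of that last point, in Python; the split between serial per-frame work and parallelizable work is invented for illustration, not measured from any game:

  # Toy frame-time model: each frame has a serial portion (setup, submission)
  # plus work that scales across cores. All numbers are illustrative only.
  def fps(serial_ms, parallel_ms, cores):
      return 1000 / (serial_ms + parallel_ms / cores)

  # Simple frame: little parallel work, so single-thread speed dominates.
  print(f"simple frame,   8 cores: {fps(4.0, 8.0, 8):.0f} fps")
  print(f"simple frame,  16 cores: {fps(4.0, 8.0, 16):.0f} fps")

  # Complex frame: lots of parallel work, so core count matters far more.
  print(f"complex frame,  8 cores: {fps(4.0, 64.0, 8):.0f} fps")
  print(f"complex frame, 16 cores: {fps(4.0, 64.0, 16):.0f} fps")

At high frame rates the fixed serial cost dominates (favoring strong single-threaded CPUs), while heavier frames leave more room for extra cores and raw throughput to pay off.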

In short, the way that reviewers test CPU performance in games results in the tests being artificial and not really reflective of what most end users would actually experience.

After all, a graph showing nearly identical CPU performance across the lineup and the reviewer concluding, "yep, still GPU-limited," doesn't make for an interesting article/video.


Check out bioelectric regeneration research.

If you haven't come across it, the teams inject drugs that do nothing to the cells except instigate an electric-field effect. What happens is regrowth of a limb to the correct "spec" even though that physical information is gone (the limb was amputated).

This suggests to me an equalizing effect exists, where fields and matter feed each other just enough to reach structural equilibrium.

Relativistic information network effects, proving what math objects create which field effects, and the social impacts, are going to become huge and blow away our current engineering goals of making hard silicon computers.

We might be able to use nature itself as our CPU.


It’s been known for a while that human communication is the real impediment to technical development.

Companies I've worked at that sucked had awful internal communication. It was all very friendly, but it was all platitudes and euphemisms over a dumpster-fire technology implementation.

Phone and web apps are basically librarian work these days. If a business's tech stack is having issues, it's human communication that's the real problem.


> Companies I've worked at that sucked had awful internal communication. It was all very friendly, but it was all platitudes and euphemisms over a dumpster-fire technology implementation.

Honestly, I think part of the problem is people going around talking so much about soft skills and how people don't like to be treated harshly that they've forgotten you need hard skills, and in nearly every profession where real leadership is needed, you get told where you screwed up and how to fix it. Literally, at the company I work for, people screw up and all you hear about is positive things.

