Hacker News | darkfloo's comments

I can also offer anecdotal support from esports I know about (League of Legends, StarCraft 2, and Counter-Strike). In StarCraft 2 the best historical players are all child prodigies. Same for Formula 1: all recent World Champions were child prodigies.


Anyone better at physics than me willing to explain how this does not break general relativity? I might be missing the point of the experiment completely, to be fair.


Nobody has fully combined GR and QM, but the ELI5 summary of the best efforts is:

There's no paradox from quantum state being transmitted faster than light, because quantum state is, fundamentally, not directly measurable.

This may lead to a follow-up question of "OK, so why do we even care?", to which the answer is that the relevant quantum state (the entanglement bit of it) causes correlations in other measurements. If you only read off the measurable value of an entangled particle, you know nothing. If you do a quantum teleportation, you also send information over a non-quantum channel, which you can combine with the reading of the entangled particle, and with both you now know more than you knew from the non-quantum channel alone.

In the case of internet encryption, this mainly means sending a random key and knowing it hasn't been intercepted, because any interception would break that correlation.

Because a quantum teleportation must also send information classically, it's never FTL.
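To make the "quantum channel plus classical bits" point concrete, here's a toy state-vector simulation of the standard teleportation protocol in plain NumPy (no quantum SDK; the gate ordering and the post-selected measurement outcomes are my own illustrative choices). Bob's qubit only reproduces the input after he applies corrections driven by the two classical bits Alice sends him, which is exactly why nothing usable travels faster than light:

```python
import numpy as np

# Single-qubit gates and the two-qubit CNOT.
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)
H = np.array([[1, 1], [1, -1]], dtype=float) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def teleport(psi, m0, m1):
    """Teleport state psi, post-selecting Alice's measurement outcome (m0, m1).
    Returns Bob's qubit after the classically-communicated corrections."""
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)    # shared pair (|00>+|11>)/sqrt(2)
    state = np.kron(psi, bell)                    # qubit 0: input; qubits 1-2: pair
    state = np.kron(CNOT, I2) @ state             # CNOT, control q0, target q1
    state = np.kron(H, np.kron(I2, I2)) @ state   # Hadamard on q0
    # Project onto Alice's outcome |m0 m1> and renormalise Bob's amplitudes.
    bob = np.array([state[4 * m0 + 2 * m1 + b] for b in (0, 1)])
    bob = bob / np.linalg.norm(bob)
    # Bob's corrections, driven by the two classical bits: X^m1, then Z^m0.
    if m1: bob = X @ bob
    if m0: bob = Z @ bob
    return bob

psi = np.array([0.6, 0.8])                        # arbitrary normalised input
for m0 in (0, 1):
    for m1 in (0, 1):
        assert np.allclose(teleport(psi, m0, m1), psi)
```

Without the two classical bits, Bob's pre-correction state looks like random noise to him; only after combining them does he recover psi.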


Yes, this is fundamentally an incomplete view. Japan's index peaked in 1989 before peaking again this year; if you had bought at the peak it would have taken you 35 years to recover your money. Past performance is not indicative of future performance.


It was always pretty nebulous, relevant video by DoshDoshington https://youtu.be/FT6XfaHgyh0?si=xayqzhkkmYjB4_UC


No, it wasn't always nebulous. Roguelike was a well-established genre for decades before it got hijacked and now means nothing.

Like all genres, games within the roguelike genre (or what some people call "traditional roguelikes") have some variance. But if you played two games in the "traditional roguelike" genre, you'd definitely feel the similarities.

These days if you pick two random games on Steam with the "roguelike" tag, you're going to get two experiences that are not even reminiscent of each other.


Great video


That only works if rates don't go up, which is probably not a good premise to rely on.


If you are in a balanced-budget or continued-deficit situation, then yes, increased rates will eventually be a factor (though even then it's a lagging effect). But if you have a surplus large enough that, with inflation increasing its nominal size at the same real revenue and spending, you can pay down debt at least as fast as it comes due, then you never go back for new net borrowing, and the increased cost of borrowing doesn't matter much.
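A toy illustration of that lag (all numbers invented): only the share of debt that matures each year reprices at the new, higher rate, and a sufficient surplus retires maturing debt before it ever rolls over:

```python
def interest_path(debt, annual_balance, new_rate, old_rate=0.02,
                  maturing_share=0.1, years=10):
    """Yearly interest bill when 10% of the debt matures each year and only
    that slice reprices to new_rate. annual_balance > 0 is a surplus used to
    retire maturing debt; a deficit (negative) adds to new borrowing."""
    old_debt, new_debt = float(debt), 0.0
    bills = []
    for _ in range(years):
        bills.append(old_debt * old_rate + new_debt * new_rate)
        maturing = old_debt * maturing_share
        old_debt -= maturing
        # Roll over (at the new rate) whatever the balance can't retire.
        new_debt += max(0.0, maturing - annual_balance)
    return bills

# Rates jump from 2% to 6%:
surplus_bills = interest_path(100.0, 15.0, 0.06)   # bill shrinks every year
deficit_bills = interest_path(100.0, -5.0, 0.06)   # bill climbs as debt reprices
```

With the surplus, the rate rise never shows up in the interest bill at all; with the deficit, it feeds through only gradually, which is the lag the comment describes.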


As a semi-casual user of Python who had to battle with dependency management recently, can you elaborate on why that would not be a good thing? I thought about switching our project to uv but could not find the time necessary.


Sure – and I think it's certainly proving to be a good thing so far! My concerns are more long-term. I see two primary ones:

(1) As uv’s governance is driven by a for-profit company, I see incentives that will eventually compromise on its benefits.

(2) Python packaging has historically been very fragmented, and more recently there’s been lots of work on standardization. That work will be impacted when users massively shift to one package installer.

Neither of those things are clear negatives, but they’re worth being aware of.


> That work will be impacted when users massively shift to one package installer.

Charlie Marsh (who founded Astral that develops uv) is very engaged in the standardisation process around Python packaging. The whole idea around uv is to be something that follows the standards as much as possible. uv has been much more aggressive about conforming than the other package managers.


yep, I really appreciate their current efforts, but still think it’s a point of concern. Feels risky to have so much of an ecosystem resting on so few people (bus factor, governance, etc). Hopefully with Astral being a for-profit business they’ll find ways for their work to be more sustainable than other package managers’ maintainers.


You may be overestimating the amount of time it takes to switch to uv.


Took me all of about 10 seconds after I decided to switch from Poetry and PipX. Been just learning it bit by bit as I go along and been really pleased with it thus far.
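For anyone sizing up the switch: assuming a pyproject.toml-based project, the migration really can be a couple of commands (these are uv's documented subcommands; the added package is just an example):

```shell
# Install uv (Astral's standalone installer; pipx and brew also work).
curl -LsSf https://astral.sh/uv/install.sh | sh

# In an existing pyproject.toml project: create a venv, resolve and
# install all dependencies, and write/update the lockfile.
uv sync

# Day-to-day usage: add dependencies and run tools in the project env.
uv add requests
uv run pytest
```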


Medium format, and to a greater extent large format, can produce exceptional image quality (sharpness, detail, and contrast). It's a cliché at this point, but Ansel Adams' work still looks very modern today. They were, however, slow, heavy, difficult to work with, and extremely expensive, so most people stuck with small format once it became available. In fact I would bet that most pictures taken before small format took over look technically better than those taken after.


Slightly unrelated, but is there any way of maintaining that low a temperature (77 K and 10 K according to the paper's numbers) that does not immediately kill perf/W and perf/$? Otherwise you might as well just buy more CPUs.


The minimum amount of work needed to pump some amount of heat Q from a temperature T0 to a higher temperature T1 is W = Q*(T1/T0 - 1). For example, if your ambient heat sink is at 20C (293K) you need at least 2.8W of electricity to run the cooler for every 1W dissipated at 77K, or 28.3W for 1W dissipated at 10K. This is the thermodynamic lower limit, and practical heat pumps will be less efficient in general. In practice it might be something like 4x and 50x, respectively.
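Those figures can be reproduced in a couple of lines (same Carnot lower-bound formula as above):

```python
# Thermodynamic minimum work to pump q_watts of heat from t_cold up to t_hot.
def min_cooling_work(q_watts, t_cold, t_hot=293.0):
    """W = Q * (T_hot / T_cold - 1), the Carnot lower bound."""
    return q_watts * (t_hot / t_cold - 1.0)

print(min_cooling_work(1.0, 77.0))   # ~2.8 W per watt dissipated at 77 K
print(min_cooling_work(1.0, 10.0))   # ~28.3 W per watt dissipated at 10 K
```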


Leakage current is what heats up the chip, and if it drops by five orders of magnitude when it's cool, the energy requirements for refrigeration will be low. Memory chips are already not that power-dense (on the order of 10W for a DIMM) so we're only talking about extracting 1mW of heat from the cryo chamber.

>As IOFF at 77 and 10 K decreases by four to five orders [29], the primary constraint of building a large memory array, i.e., leakage current (Ileak), will not be a major concern and will lead to novel design tradeoffs for memory optimization.


This comment assumes that the leakage current is all of the power draw, and not just the majority of it. I find it unthinkable that leakage current is 99.99% of the power draw of SRAM. 95% sounds believable, but then you're talking about removing 500 mW, not 1 mW.

This also gets rather tricky, because the standard way to connect computer chips is with copper traces, which are wildly good conductors of heat. A solution like this will probably need optical interconnects made from a thermal insulator.

It's a fun design problem to chew on.


> Leakage current is what heats up the chip

Leakage current is generally a rounding error for heat. In CMOS, the power that causes the most heat is the dynamic switching power, which is lost as P = C * Vdd^2 * f.

This implies that for the fastest chips, most power is lost simply to running the clock, which has both the highest frequency and the largest capacitive load.

Where leakage current matters is for battery driven systems where you spend most of your time sleeping.

I strongly suggest that you go over this lecture "CMOS Power Consumption": https://course.ece.cmu.edu/~ece322/LECTURES/Lecture13/Lectur...
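A back-of-the-envelope sketch of that formula, with made-up but plausible numbers for the switched capacitance and supply voltage:

```python
# Dynamic CMOS power: P = C * Vdd^2 * f.
def dynamic_power(c_farads, vdd_volts, freq_hz):
    return c_farads * vdd_volts**2 * freq_hz

# 1 nF of effective switched capacitance at 1.0 V:
print(dynamic_power(1e-9, 1.0, 3e9))   # ~3 W at 3 GHz
print(dynamic_power(1e-9, 1.0, 1e6))   # ~1 mW at 1 MHz
```

The quadratic Vdd term and the linear frequency term are why clock gating and voltage scaling dominate power optimisation, and why a mostly-sleeping battery device is instead dominated by leakage.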


But in a large SRAM, most of the gates are not switching, at any given time. The cells are mostly just sitting there holding their data.

And if cooling it lets you shrink the SRAMs, that's also going to reduce the capacitance, so switching power will also be reduced. I'm sure a design optimised for low temperature will do some clever stuff with clock gating as well.

The problem here is that you generally put SRAM on the same die, or at least package, as the processors. And those do switch many of their gates.

So you’d probably have to do this in a case where you want a lot of fast RAM in a different box, with some really fast optical interconnect to your processing cores.


The sense lines, however, are switching--as is the clock. Just because the RAM cells are sitting there doing nothing doesn't mean that everything else in the RAM is also idle.

Also, take a look at the Apple M3 chip, for example. Note how much of the die size isn't RAM.


77 K is basically the boiling point of liquid nitrogen, and 10 K is probably the same for liquid helium. Liquid nitrogen is in ample supply and is not difficult to manufacture; I suppose one could have a facility on site to produce it and use it immediately. It is going to be very energy intensive though... To answer your question, I struggle to think of a scenario where it would be better than buying more compute power. I suppose for stubbornly serial workloads... but I'm not sure what those could be. Running Crysis at 20K resolution?


Boiling point of He at 1 bar is 4.222 K, and its critical point is at 5.1953 K. At 10 K helium is a gas.


Ah, thank you! I'm surprised they'd pick an odd temperature like 10 K then. I had a vague memory of this being close to the He boiling point but couldn't remember how tight the margins were.


You can make liquid N2, though very inefficiently. So yeah, power is an issue, although we are still making gains in cooling efficiency, so it's not inconceivable the equation could swing towards super-low-temperature coolants.


I was curious, so I googled around a bit — please excuse the weird units.

- about 0.375 kWh to produce 1 kg of LN2

- about 0.056 kWh to boil 1 kg of LN2

So you get about 15% efficiency, though there is "waste cold" in the exhaust you could recover if you wanted, e.g. to run a Stirling engine. You still have a 220 K temperature differential versus ambient after boiling to gas.
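Those figures check out against the latent heat of vaporisation of nitrogen, roughly 199 kJ/kg (the 0.375 kWh/kg production figure is the parent's):

```python
# Rough round-trip efficiency of LN2 as a "cold store".
KJ_PER_KWH = 3600.0

latent_heat_kwh = 199.0 / KJ_PER_KWH   # ~0.055 kWh of cooling per kg boiled
production_kwh = 0.375                 # electricity to liquefy 1 kg (parent's figure)

efficiency = latent_heat_kwh / production_kwh
print(f"{efficiency:.0%}")             # ~15%
```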


One slight advantage: you can store liquid nitrogen, so you can use cheaper electricity to produce it.


The idea I heard was to make liquid nitrogen during the day when solar power is abundant and then run the chips at greater efficiency at night using your stored liquid nitrogen.


Trading algorithms.


Right - very good point! But this is really only relevant for HFT algos; almost everything else is much less sensitive to speed and is also more parallelizable.

For HFT to work it needs to be colocated, I believe, and I haven't heard of anyone trucking in liquid N2 or producing it on premises. Not saying it isn't happening; I'm involved in mid-frequency trading, so I only have circumstantial knowledge.


They mention space, medical, and quantum computing equipment as the target uses, where all of the processing is done at cryogenic temperatures. One of the biggest benefits they have found is that increased density in chips is possible. The researchers behind this paper are only working with approximate numbers, and as mentioned are using the numbers for liquid nitrogen, but space-based cryo pumps use helium, so the actual performance would improve. https://hackaday.com/2022/05/05/about-as-cold-as-it-gets-the...


On Earth, difficult, as you need to pay the price of being inside a 300 kelvin environment. But there's no such temperature in space, just the size of the radiator you'll need anyway. So there may be a very real performance improvement from doing math in space.


Radiation will want to talk to you.

OTOH, you might want to bury your supercomputer deep in the crust of Pluto (or in a permanently shaded lunar crater) with just a radiator sticking out.

Latencies between Earth and Pluto can be a problem for computing, but I would appreciate the impossibility of receiving Teams calls. Also, any AI running on that hardware will have a ton of time to think about... anything.


What is the point of burying it? The cosmic background radiation is 2.7 K, and I have to imagine the interior of any body like Pluto would be warmer than that.


More for shielding, but you are correct. With proper shielding it makes little difference.


If you really really want single-thread performance, that's where you go.


Not sure, but not all tasks are possible or easy to split among multiple CPUs, so it's not always "might as well"... Just saying.


Planes are cheaper as a function of the infrastructure needed to allow their use (feel free to correct me). As long as we allow airline companies not to pay for the long-term externalities they are creating, planes will stay cheaper.


In the EU, jet fuel isn't taxed at all, quite the contrast with gasoline prices.


Iain M. Banks' Use of Weapons is both an incredibly good book and a portal to the Culture series, in which almost all the books are just as good.


The Culture series permanently changed my perspective on the universe. Earth is just one more of the oddball primitive worlds well off the beaten path.

