There's really no reason DirectX 12 can't be as fast as Vulkan. The fact that translating DirectX calls to Vulkan can make things faster sort of proves that point.
I love how the second half of this article is obviously just AI slop pushing an agenda. That says a lot about how much Microsoft "cares".
Frankly, the things they've listed as action items for the future are things that they should have been doing FROM THE BEGINNING.
Like, how on earth was
> Faster and more responsive Windows experiences
NOT part of the general release cycle of a major Windows update? How did they not notice that the File Explorer experience in 11 was noticeably worse than Windows 10 on the same hardware?
We all know the answer: the highest priority wasn't a good UX, it was making sure Copilot was integrated into everything.
So long as Microsoft management doesn't prioritize performance (and they clearly do not), this is just the natural end state of any software. If you aren't focusing on, and paying, your developers to make things faster and smoother, you'll get this sort of high memory consumption and janky applications. Making things not janky requires someone in management to care about it.
You aren't well connected to Gen Z. Especially the Gen Z not going into higher education.
The only computers they tend to own are phones, tablets, and maybe a game console.
Heck, my millennial sister-in-law got her first computer because of COVID. Until that point, the only computers she used were her work computer and her phone.
I agree, a tablet isn't as capable as a laptop. However, a very large portion of the population doesn't need those capabilities. They just want something to watch Netflix on.
> has the potential to be good for parts of the us energy sector.
No way this is good for anyone other than oil producers. The only potential positive for the US energy and shipping sectors is that it's going to put even more pressure on adopting renewables as fossil fuel costs spike.
That's good only for inflation, nothing else. Renewables included: after inflation and tariffs, they won't become any more attractive compared to hydrocarbons.
> Markets are now forecasting oil prices will stay above $100 a barrel for multiple years
It'll never go back below $100 a barrel.
It went below $100 a barrel because, as US shale production came online, OPEC decided to keep its own production high as well, which cut oil prices from $100/barrel to ~$50/barrel, where they've roughly stayed for the last decade.
There's no other "new way to extract oil for cheap" technique on the horizon. Israel and Iran are both destroying oil extraction and processing facilities in the gulf region; it'll take years and a huge amount of money just to rebuild them. By the time that's finished, assuming it ever is, inflation will have firmly pushed the price of oil above $100.
This is basically a permanent increase. We're at the point of global warming catastrophes now; it's not a question of if it will be bad, but of how long and how intense. The longer this war/military operation/regime change/whatever we're doing goes on, the worse it will be. And, unfortunately, I don't think there's any specific goal the Trump admin is trying to achieve. This is such an obvious F-up, and Trump will only pull out if he can somehow claim it isn't one.
> Israel and Iran are both destroying oil extraction and processing facilities in the gulf region
This isn't like Katrina where oil infrastructure was being temporarily evacuated, shut down, and taking some water and wind damage.
The oil infrastructure is being blown to smithereens. And not just pumps that are sucking oil out of a hole in the ground. Refineries. Big expensive factories that process oil. Stuff we don't even bother to build in progressive parts of the world because the combination of environmental regulations and concerns about climate change mean it's possible they'll never pay off their massive construction costs.
> Stuff we don't even bother to build in progressive parts of the world because the combination of environmental regulations and concerns about climate change mean it's possible they'll never pay off their massive construction costs.
Ignoring pollution and externalities for the sake of argument, this is what's very interesting to me. It's not clear that the capital markets, left to their own devices, would even invest in rebuilding these, given concerns about being paid back.
The oil industry has gone from growth-based investment to capturing returns on already-deployed infrastructure over the past decade or so. Only very limited and calculated capital is being deployed in this sector these days.
It will certainly be interesting to watch. I'm certain those countries will rebuild via government funds, but I'm wondering how profitable that will actually end up being, given how expensive it's become to build, the insane construction lead times these days, and the overall trend in oil demand. Natural gas is even more interesting, since it's directly linked to renewable energy deployment: gas usage goes up as we deploy more solar worldwide, at least in the short term.
That was basically entirely on Carly Fiorina, Mark Hurd and the board of directors. It's pretty similar to what happened to Boeing.
HP had engineers at the helm right up until Fiorina. She came in and destroyed a lot of what made HP great to work at, while not doing a great job of managing the company either.
Then Hurd came in and just gutted the company, to the delight of the shareholders. I came in as an intern right as Hurd went out. The place was in shambles when I got there. He'd fired and outsourced everyone he could. The IT there was a complete joke. It was actually insane that HP, of all companies, decided to outsource IT operations.
Not much of a story. Like I said, I was an intern so I mostly heard this stuff from my coworkers (it's been a while too).
My boss was a manager in IT and they were fortunate enough to get a heads up before the shitshow hit. They moved departments right before everyone in IT got laid off.
I had requests to IT that I put in at the beginning of my internship which were only getting handled by the end of it.
Real basic stuff like getting my badge was a nightmare. I had to make a 3 hour drive to another building just to get my badge. The appointment to do that took 3 months, which meant my coworkers had to let me into the office and past security every day.
General office supply and admin were really bad. I sat in a broken chair for my entire internship. Employees were buying their own office furniture, like chairs, because there was basically nobody at the helm handling basic requisitions like that.
The IT firm we contracted out to was obviously one that mostly serviced the likes of banks or chain restaurants. The stuff they technically "owned" they were completely detached from. The only thing they knew how to do was Active Directory management. But like I said, they were extremely slow and backed up. Understandable, because HP is a huge company to contract out to.
Leadership was a total mess. I had something like 3 different bosses I technically reported to, and it was never clear to me exactly where I was supposed to sit in the org chart.
This is the second time this week on HN that I've seen people suggesting object pools to solve memory pressure problems.
I generally think it's because people aren't experienced with diagnosing and fixing memory pressure. It's one of the things I do pretty frequently for my day job. I'm fortunate enough to be the "performance" guy at work :).
It always depends on what the real issue is, but generally speaking, the problem to solve isn't reinventing garbage collection; it's eliminating the reason for the allocation.
For example, a pretty common issue I've seen is copying a collection just to do transformations. Switching to streams, combining transformation operations, or, in one extreme case, passing around a consumer object was the way to avoid a string of collection allocations.
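A minimal sketch of what I mean (class and method names here are mine, purely illustrative, not from any particular codebase):

```java
import java.util.List;
import java.util.function.Consumer;

public class AvoidCopies {
    // Allocation-heavy version: each transformation step copies
    // the whole collection into a fresh list.
    static List<String> copying(List<String> input) {
        List<String> trimmed = input.stream().map(String::trim).toList();
        List<String> upper = trimmed.stream().map(String::toUpperCase).toList();
        return upper;
    }

    // Fused version: one stream pipeline, one result list.
    static List<String> fused(List<String> input) {
        return input.stream()
                .map(String::trim)
                .map(String::toUpperCase)
                .toList();
    }

    // Consumer version: no result collection allocated at all;
    // the caller decides what happens to each transformed element.
    static void forEachTransformed(List<String> input, Consumer<String> sink) {
        for (String s : input) {
            sink.accept(s.trim().toUpperCase());
        }
    }

    public static void main(String[] args) {
        List<String> data = List.of("  foo ", " bar");
        System.out.println(fused(data));
        forEachTransformed(data, System.out::println);
    }
}
```

The consumer-passing variant is the extreme end of this: it avoids even the final result list, at the cost of a slightly less familiar call site.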
Even in cases where small allocations end up killing performance, like the autoboxing example from the OP, the solution is often to make something mutable that isn't, or to switch to primitives (Valhalla can't come soon enough).
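As a rough sketch of the autoboxing cost (names are mine, not from the OP):

```java
public class Boxing {
    // Boxed accumulator: every += unboxes, adds, then re-boxes,
    // allocating a fresh Long for values outside the small-value cache.
    static Long boxedSum(int n) {
        Long total = 0L;
        for (int i = 0; i < n; i++) {
            total += i;
        }
        return total;
    }

    // Primitive accumulator: no per-iteration allocation at all.
    static long primitiveSum(int n) {
        long total = 0L;
        for (int i = 0; i < n; i++) {
            total += i;
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(boxedSum(1000));     // 499500
        System.out.println(primitiveSum(1000)); // 499500
    }
}
```

Same result, but the boxed version churns out garbage on a hot path, which is exactly the kind of allocation rate that shows up in a JFR recording.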
Heck, sometimes even an object cache is the right solution. I've had good success reducing the size of objects on the heap by creating things like `Map<String, String>` and then doing a `map.computeIfAbsent(str, Function.identity());` (Yes, I know about string interning, no I don't want these added to the global intern cache).
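A minimal sketch of that kind of private dedup cache (class name is mine):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class Dedup {
    // Private dedup cache: equal strings collapse to one canonical
    // instance, without touching the JVM-wide intern table.
    private final Map<String, String> cache = new HashMap<>();

    String dedup(String s) {
        // First time a value is seen, the key itself is stored as the
        // value; later equal keys return that stored instance.
        return cache.computeIfAbsent(s, Function.identity());
    }

    public static void main(String[] args) {
        Dedup d = new Dedup();
        String a = d.dedup(new String("hello"));
        String b = d.dedup(new String("hello"));
        System.out.println(a == b); // true: same canonical instance
    }
}
```

Unlike `String.intern()`, this cache's lifetime and scope are yours to control: drop the `Dedup` instance and the whole cache is collectible.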
Regardless, the first step is profiling (JFR recordings and heap dumps) to see where memory is spent and what dominates the allocation rate. That's the step people often skip, jumping straight to fixing what they think is broken.
The orders-by-hour example could be made faster. The issue is that it uses a map where an array is both faster and works just fine.
On top of that, the map boxes the "hour" which is undesirable.
This is how I'd write it
long[] ordersByHour = new long[24];
var defaultTimezone = ZoneId.systemDefault();
for (Order order : orders) {
    int hour = order.timestamp().atZone(defaultTimezone).getHour();
    ordersByHour[hour]++;
}
If you know the bound of an array, it's not large, and you're directly indexing into it, you really can't do any better performance-wise.
It's also not less readable, just less familiar as Java devs don't tend to use arrays that much.
Maybe it would be a little better to use ints rather than longs, since Java lists can't hold more than Integer.MAX_VALUE elements anyway. Saves you a cache line or two.
Fair point, but it is possible this isn't a list but rather some sort of iterable. Those can be boundless.
Practically speaking, that would be pretty unusual. I don't think I've ever seen that sort of construct in my day to day coding (which could realistically have more than 1B elements).