Ehh, that seems pretty reductive. I could just as easily claim women love games with character customization or games with deep stories. All of these things may have some truth to them. But (1) it’s unclear how universal this is and (2) it’s unclear if this differentiates women from men or is just something people in general like. “Good character design” is incredibly vague and appreciated by a lot of people.
Great article! Needless gendering absolutely hinders innovation, in game design and elsewhere. (Not to mention the unfairness, oppression, and general absurdity.)
Slightly unrelated, but the point about tutorials starting with the “basics”, i.e., “making a character move and attack” is interesting. On the one hand, if you have a strong enough grasp on programming fundamentals, it should be pretty easy to take what you learn there and make a “dress-up game”. Heck, a basic dress-up game shouldn’t be any harder than a platformer.
But if you lack that fundamental knowledge and are only interested in games, you need to develop it somehow, and you don’t want to build ‘boring’ console apps; games should be a platform for learning programming. So I agree wholeheartedly with the author: we need more diversity in introductory game programming tutorials!
Of course, that brings us to another can of worms with programming education: Tutorial Hell. But beginners need to start somewhere, and that somewhere should motivate them to continue learning and exploring on their own.
It is more than just a problem of tutorials. Game engines absolutely favor some genres over others.
If you are inexperienced and asking for advice on making a game, the most common answer nowadays is to use Unity. That is reasonable advice. Unity is a well established engine, with good tooling, a bunch of tutorials and community knowledge, and can be made to solve almost any problem you throw at it.
However, Unity is oriented around "traditional" games like the article describes: entities moving around in a 2d or 3d world. If your game fits that mold, you can have something up and running within a day, even with no experience. If it doesn't, you are going to need to spend time fighting the engine before you have even the basics of a playable environment.
Maybe you have an idea for a game that is story driven, where players read a portion of the story, then make a decision about what the character wants to do. If you know what you are doing, you would pick a visual novel engine like RenPy, and you'll have your basic game environment up within a day.
Am I the only one disappointed this was about some LLM slop and not digital signal processing? DSP is a well-established technical acronym, so I expected to hear about a new Python DSP library. Oh well.
It’s had an impact on software for sure. Now I have to fix my coworker’s AI slop code all the time. I guess it could be a positive for my job security. But acting like “AI” has had a wildly positive impact on software seems, at best, a simplification and, at worst, the opposite of reality.
I feel this article's argument is weak, largely for one key reason: They don't clearly define anything. Their references might clarify some things, but not all. They argue against "general problem-solving strategies" with a reference to Polya, but they don't provide a clear definition of what these strategies entail. How broad is the set of strategies they're arguing against? What are some examples of such strategies? I'd like something beyond two sentences on Polya.
Furthermore, what audience and level of mathematics education are we discussing? The goals (and hence appropriate metrics of success) are certainly different for high schoolers targeting non-STEM careers vs. engineering undergrads vs. math grad students. The authors reference "aspiring mathematicians" and "domain specific mathematical problem-solving skills", indicating they're arguing about education for math majors, or at least students in STEM fields. In that case, the argument is somewhat meaningless - who's arguing math majors shouldn't learn math-specific skills? But, as I understand it, the argument for general problem-solving skills is that students outside of math don't actually need many specific math skills. Instead, math is a vessel for teaching logic, reasoning, and problem-solving skills. Then again, this might not be the type of problem-solving the authors are referencing - as I said above, it's not very clear.
On a similar note, they cite evidence that studying worked examples is more effective than "general problem-solving strategies", citing an "improvement in subsequent problem-solving performance" without explaining how this performance is measured. If students are tested on specific problem types, of course they'll perform better when taught strategies for those specific problem types. But it's not clear that this is meaningful. For STEM majors, sure, solving specific problems is a skill worth cultivating. But for most students, solving specific problems isn't as important as learning logic, reasoning, and general problem-solving skills. In my anecdotal experience tutoring math, students tend to just memorize strategies for specific problem types instead of learning transferable logic and reasoning skills because that's what's tested. I'd be curious to see which method of learning facilitates better performance on a more general problem-solving test of some sort.
Now, I'm not an education researcher or an educator of any sort. But I am passionate about good STEM education, especially in math. I genuinely feel that math education fails most students, at least here in America. If I'm being generous, this article is a well-intentioned but poorly-executed argument for effective math education strategies. If I'm not being so generous, this article advocates for the status quo in math education that forces students to slog through years of math classes for little discernible benefit. Either way, it's a disappointing article with a poorly-explained thesis.
> Furthermore, what audience and level of mathematics education are we discussing?
I wonder this too; I think they might mean university-level as well. For younger audiences, I feel one of the biggest obstacles to understanding math is that students don't see why any of it is relevant. If educators can frame it more as teaching general problem-solving abilities, that will likely improve overall acceptance and lead to better overall math skills as a result.
As a specific example, our high-school math curriculum taught a lot of calculus, but framed it incorrectly as being a useful tool that people would use. E.g., as if a businessman would write down an equation for their revenue based on inputs, and then take the derivative to compute the maximum. I'm assuming they told students this to try to get them motivated, but it was clearly a lie, since everybody knows you could just plot a graph and look at it to find the maximum. If they had instead been honest that the point of learning calculus is to support more advanced concepts in math/engineering/science, while also being a valuable tool for learning general problem solving, I think that would have been a better result.
> As a specific example, our high-school math curriculum taught a lot of calculus, but framed it incorrectly as being a useful tool that people would use.
One day at FedEx the BoD (board of directors) was concerned about the future of the company and as part of that wanted an estimate of the likely growth of the company.
In the offices there were several efforts, free-hand, wishes, hopes, guesses, what the marketing/selling people thought, etc., and none of those efforts seemed to be objective or with a foundation or rationality.
We knew the current revenue. We could make an okay estimate of revenue when all the airplanes were full. So, the problem was essentially to interpolate over time between those two numbers.
For the interpolation, how might that go? That is, what, day by day, would be driving the growth? So, notice that each day current customers would be shipping packages, and customers to be would be receiving packages and, thus, learning about FedEx and becoming customers. That is, each day the growth would be directly proportional to (1) the number of current customers creating publicity and (2) the number of customers to be receiving that publicity.
So, for some math, let t be time in days, y(t) the revenue on day t, t = 0 for the present day, and b the revenue when all the planes were full. Then for some constant of proportionality k, we have
y'(t) = k y(t) (b - y(t))
where y'(t) = dy/dt is the calculus first derivative of y(t) with respect to t.
Seeing how the growth goes for several values of k, pick one that seems reasonable. Draw the graph and leave it for the BoD.
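For readers who want to reproduce the curve: the ODE above, y'(t) = k y (b - y), has a well-known closed-form solution, the logistic curve y(t) = b / (1 + ((b - y0)/y0) e^(-k b t)). A minimal Python sketch (hypothetical, not the original HP-calculator work; the revenue numbers are made up for illustration):

```python
import math

def logistic(t, y0, b, k):
    """Closed-form solution of y'(t) = k * y * (b - y) with y(0) = y0."""
    return b / (1 + (b - y0) / y0 * math.exp(-k * b * t))

# Current revenue 1 unit, full-planes revenue 100 units, a guessed k.
# Try several k values and pick one that seems reasonable, as described.
for t in (0, 365, 2 * 365, 5 * 365):
    print(t, round(logistic(t, 1.0, 100.0, 5e-6), 2))
```

At t = 0 this returns y0 exactly, and as t grows it saturates at b, which is the "lazy S" shape the BoD graph showed.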
That was a Friday, and the BoD meeting started at 8 AM the next day, Saturday.
First thing at the meeting, two crucial BoD members asked how the graph was drawn. For several hours, no one had an answer. The two members gave up on FedEx, got plane tickets back to Texas, returned to their rented rooms, packed, and as a last chance returned to the BoD meeting. FedEx was about to die.
I did all the work for the graph, the idea, calculus, arithmetic (HP calculator), but didn't know about the BoD meeting. Someone guessed that I did know about the graph, and I got a call and came to the meeting. The two crucial BoD members were grim, standing in the hallway with their bags packed, and their airline tickets in their shirt pockets.
I reproduced a few points on the graph, and FedEx was saved.
Interesting, but I still think most problems like that are solvable via Excel. Put some formulas in some cells, tweak some variables until you find a way to maximize something. Possibly use graphs or pivot tables or other advanced features to help if needed.
Once you’ve figured out the solution, then you build a pretty graph for the BoD proving it. Make sure to keep the spreadsheet around as evidence.
Sure, for some applications of calculus you can use just discrete steps. That is, instead of the calculus dy/dt, just use a discrete update like Δy = k y (b - y) Δt. Then, for the arithmetic, some code can be short and, compared with cells in a spreadsheet, easier and with more control over the time steps, e.g., in Rexx with cf for customer fraction:
  Say ' ==== Growth ===='
  Say ' '
  Say '           Customer'
  Say '    Year   Fraction'
  max_years = 5
  steps_per_year = 10 * 365
  cf = 1 * ( 1 / 100 )
  year = 1
  k = 1 * ( 1 / 2000 )
  Do Forever
    Do i = 1 To steps_per_year
      cf = cf + k * cf * ( 1 - cf )
    End
    Say Format(year,9) Format(100*cf,10,2) || '%'
    If year = max_years Then Leave
    year = year + 1
  End
So, get a 'lazy S curve'. I've since learned that the curve has a name, the 'logistic curve'. And, right, can also consider that curve for other cases of growth, e.g., for a first, rough estimate, COVID.

Adjust some of the constants in the program and can get more output, say, for each month, day, etc. The code above uses 10 steps per day.

For more, someone could use the calculus solution and compare.
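That comparison is easy to run. Here is a hypothetical Python port of the Rexx loop above (same constants: cf starting at 1%, k = 1/2000, 10 steps per day), next to the closed-form logistic curve with b = 1 and time measured in steps; with steps this small the two stay close:

```python
import math

def discrete_cf(steps, cf0=0.01, k=1 / 2000):
    """Same update as the Rexx loop: cf = cf + k * cf * (1 - cf)."""
    cf = cf0
    for _ in range(steps):
        cf += k * cf * (1 - cf)
    return cf

def calculus_cf(steps, cf0=0.01, k=1 / 2000):
    """Closed-form logistic curve, b = 1, time in steps."""
    return 1 / (1 + (1 / cf0 - 1) * math.exp(-k * steps))

steps_per_year = 10 * 365
for year in range(1, 6):
    n = year * steps_per_year
    print(year, round(100 * discrete_cf(n), 2), round(100 * calculus_cf(n), 2))
```

The discrete version is the forward-Euler approximation of the ODE, so the gap between the two columns shrinks as the step count per day goes up.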
In a sense, for the FedEx problem and the assumptions about what was driving the growth, the calculus solution is a smooth version of the somewhat more appropriate discrete time version. But when I did the calculation at FedEx, my best source of arithmetic was an HP calculator, in which case the calculus solution was a lot easier.
Of course, this FedEx calculation was just one example and there are many others. My view from 10,000 feet up is that in business, at times some math can be an advantage, if not the work of a steady job. If some math is an advantage, then that advantage tends to go to the owners of the business. If a mathematician wants to get paid for some math they have in mind, maybe they should start a business and be the owner.
Godot is easily my favorite game engine. The node system is simple yet powerful, truly a joy to use. For the most part, I feel like I have a good level of control over my projects without having to get too low-level. It doesn't feel like I have to organize my code around the engine's architecture, like with Unity or Unreal. Godot is also free and open-source and very lightweight. Support for C# and 3D has been improving, especially with Godot 4.0. And it has pretty good cross-platform support.
I recently read "Enabling tabular deep learning when d ≫ n with an auxiliary knowledge graph" (https://arxiv.org/pdf/2306.04766.pdf) for one of my graduate classes. Essentially, when there are significantly more data points than features (n >> d), machine learning usually works fine (assuming data quality, an underlying relationship, etc.). But, for sparse datasets where there are fewer data points than features (d >> n), most machine learning methods fail. There's just not enough data to learn all the relationships. This paper builds a knowledge graph based on relationships and other pre-existing knowledge of data features to improve model performance in this case. It's really interesting - I hadn't realized there were ways to get better performance in this case.
My thoughts exactly. The author seems to think a proprietary period is the best option. But honestly, I'd rather have data be immediately openly available and ensure we credit those who played an important role in identifying what observations should be made and obtaining the data. That really seems like the best of both worlds. Promoting open access to data and collaboration among scientists is better than promoting closed access to data and "scooping" of results/limited collaboration.
Writing blog posts has been in my back-of-the-mind TO-DO list for a while now. I just never seem to have time for it. Or rather, there are a million things I end up prioritizing over blogging. One of these days, when I'm less busy, I'll start writing some blog posts. I definitely have some ideas for posts that I think could be interesting.
But reading papers is fun and cool. You get to learn new things. If you read recent papers, you learn things that aren't just new to you but new to the entire world; you can literally learn cutting-edge things that were discovered recently. And if you read older papers, you can better understand why the world is the way it is right now.
Putting TC aside (don't most programmers make plenty anyways?), isn't it just cool to learn new and cutting-edge things? Personally, I think so. Curiosity is central to what it means to be a hacker.