Hacker News | leodeid's comments

I haven't touched C in years, but here's my descending "wtf" list:

1. Returns pointer to stack-allocated data, which immediately becomes invalid. Instead, it should be using some sort of allocation (e.g. 'malloc'), or taking in a destination pointer.

2. 'r' is arbitrarily declared with length 100. Smaller strings don't need all that space, and larger strings will definitely overrun it.

3. The function signature is really awkward. Without any of the surrounding textbook content, I'm not sure what behavior is supposed to happen. At first, I expected something like 'strcat', which takes two char* and appends the second one to the first one. But that isn't happening here and instead it seems to require dynamic allocation. (Hiding allocations inside a function is generally kind of weird. Usually the caller should be responsible for passing in a handle to the destination.)

4. There's no sensible limit on the loop iteration. If the input 't' doesn't have a null terminator, this is going to write a ton of garbage into the stack space (because 'r' is stack-allocated with a fixed size), and may also run for a really long time.

5. 'strcpy' should usually be replaced by 'strncpy', which performs the same function but also requires you to provide a limit ("copy this string, but at most 'n' bytes"). That prevents a class of exploitable errors known as "buffer overruns". I don't know when the 'n' string functions were added to C or became popular, though.

This is a teaching exercise, so the fact that this is implemented as a separate function instead of calling 'strcat' from <string.h> doesn't seem like a big problem.
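For illustration, here's one way the textbook function might be rewritten along the lines suggested above. The original code isn't shown here, so the function and parameter names are my guesses; the point is just the caller-supplied destination:

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical fix: the caller supplies the destination buffer and
   its size. Returns 0 on success, -1 if the result would not fit. */
int concat(char *dst, size_t dstsize, const char *s, const char *t)
{
    size_t slen = strlen(s);
    size_t tlen = strlen(t);
    if (slen + tlen + 1 > dstsize)   /* +1 for the terminator */
        return -1;                   /* refuse rather than overrun */
    memcpy(dst, s, slen);
    memcpy(dst + slen, t, tlen + 1); /* copies t's '\0' as well */
    return 0;
}
```

No hidden allocation, no pointer to dead stack memory, and the "buffer too small" case is an explicit error the caller must handle.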


> 'strcpy' should usually be replaced by 'strncpy'

Sorry to butt in, but this is a bit of a trigger for me: I’ve had to fix a number of programs infected with this idea.

The main problems with strncpy are:

When the source string is shorter than n, strncpy will pad the target to n bytes, filling with zeros. This is bad for performance.

When the source string is longer than n, strncpy will copy n bytes but _not_ nul-terminate the target. So you need extra shenanigans every time you use it to cover this case.

So strncpy is hardly ever a good idea. Sadly there is no standard replacement that is widely accepted. More details at https://en.wikipedia.org/wiki/C_string_handling#Replacements
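To make the second pitfall concrete, here's a minimal sketch (the wrapper name is my own) of the extra step every strncpy call needs:

```c
#include <string.h>

/* strncpy leaves dst without a '\0' whenever the source doesn't fit,
   so a wrapper has to force termination itself; this is the "extra
   shenanigans" needed to always end up with a valid C string. */
void copy_truncating(char *dst, size_t n, const char *src)
{
    strncpy(dst, src, n);   /* may copy n bytes and no terminator */
    dst[n - 1] = '\0';      /* manually guarantee termination */
}
```

And the first pitfall is invisible here: when src is short, strncpy still zero-fills the rest of the n bytes, which is wasted work for large buffers.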


I agree with you completely, and in general I think the whole idea of using "safe" string functions with built-in buffer length checking is wrong, because it addresses a symptom, not the cause.

Before writing to the buffer, you should have ensured that it's big enough, and decided what to do if it's not, long before actually writing. In other words: what happens if it's not big enough? These "always use $length_checking_function" proponents miss that point. Yes, you've avoided an overflow here, but chances are something was already too small long before the flow reached this point, and the fix is not to replace the overflow with a truncation/skipped copy/etc. here, but to fix the check or sizing that came before, elsewhere.


> Before writing to the buffer you should've ensured that it's big enough, and decided what to do if it's not, long before actually doing it. In other words, what happens if it's not big enough?

If you planned all this out, you're still making an assertion as to the length. The contract is "give me a string of this length" and if that's not enforced by the compiler, it ought to be enforced at runtime so that the error is detected and dealt with as soon as possible.

So maybe "safe string functions" should really be "fail fast string functions."
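As a sketch of that idea (the function name is hypothetical), a fail-fast copy could assert the length contract right at the boundary:

```c
#include <assert.h>
#include <string.h>

/* "Fail fast": the contract is "dst can hold src". If a caller
   violates it, abort right here where the error is still local,
   instead of truncating and letting corrupted data flow onward. */
void copy_or_die(char *dst, size_t dstsize, const char *src)
{
    assert(strlen(src) < dstsize);  /* enforce the contract at runtime */
    strcpy(dst, src);
}
```

One caveat: assert compiles away under NDEBUG, so code that wants this behavior in release builds would check explicitly and call abort() instead.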


> Sadly there is no standard replacement that is widely accepted.

True, but practically: when BSD extensions are available, the strl* functions are used. If only standard C is available, snprintf is preferred. I have seen C libraries that check for strl* availability and, if it's missing, reimplement those functions using snprintf.

So for portability, snprintf is the way to go. For correctness, and to push for their wider adoption, the strl* functions are nice.


The least bad options seem to be strl* or snprintf(..., "%s", ...), but yeah, nothing is perfect.


There's strlcpy, but it's not part of POSIX unfortunately.


  #define strlcpy(d, s, n) snprintf(d, n, "%s", s)
Not quite the same (different return type) but close.
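To illustrate why it's "close": both report the full source length, so the caller can detect truncation by comparing the return value to the buffer size (snprintf just returns int rather than size_t, and can be negative on encoding errors):

```c
#include <stdio.h>

/* The macro from above: snprintf always nul-terminates (for n > 0)
   and returns the length the full string would have needed, so a
   return value >= n signals truncation. */
#define strlcpy(d, s, n) snprintf(d, n, "%s", s)
```

For example, `int r = strlcpy(buf, "abcdef", sizeof buf)` with a 4-byte buf leaves "abc" in buf and returns 6, so `r >= sizeof buf` flags the truncation.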


> 5. 'strcpy' should usually be replaced by 'strncpy'... That prevents a class of exploitable errors known as "buffer overruns".

To be honest, strncpy is barely better in this respect (as a security improvement): truncating at an arbitrary size limit, in this day and age of text-based protocols... I wonder if outright crashing at the testing stage would be preferable to subtle misbehavior creeping into the release.

Both are bad IMO, the actual required buffer size should be known in advance.


Raw null terminated strings are just a bad idea. `std::string` and the bafflingly just-introduced `std::string_view` are the right way to handle strings. We can spare the bytes now.


> Raw null terminated strings are just a bad idea.

Absolutely agree. Scanning the memory until "we find it", potentially crossing boundaries between segments of memory with different characteristics (caching etc.) just doesn't seem right in general, and if I recall, some CPUs even used to have published errata related to that.

> `std::string` and the bafflingly just-introduced `std::string_view` are the right way to handle strings.

I'd even go straight to a custom implementation of Hollerith strings. Literals have lengths known at compile time; protocols would either carry the lengths alongside the strings, or be trusted (to have good strlen behavior) until they do; composite strings would compute the length from their components; etc. This doesn't seem too complex to do, looking from my bell tower, but I know many people here would frown upon mentioning C++ in the context of embedded development (my area).
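A minimal sketch of such a length-carrying string in plain C (all names here are made up for illustration):

```c
#include <stddef.h>
#include <string.h>

/* The length travels with the pointer, so no operation ever has to
   scan memory looking for a terminator. */
struct lstr {
    size_t len;
    const char *data;    /* need not be nul-terminated */
};

/* Literal lengths are known at compile time. */
#define LSTR_LIT(s) ((struct lstr){ sizeof(s) - 1, (s) })

/* Comparison touches exactly len bytes of each string. */
int lstr_eq(struct lstr a, struct lstr b)
{
    return a.len == b.len && memcmp(a.data, b.data, a.len) == 0;
}
```

Composite strings would sum component lengths, and protocol code would fill in len from the wire format instead of trusting a terminator.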


I've mentioned elsewhere that strncpy is not a "safer" strcpy.

Even if it were, there's safety and there's safety. A function (like strncat, for example) that quietly truncates your data if it's too long isn't necessarily better than one that quietly ignores array overruns. Consider what happens if "rm -rf $HOME/tmpdir" is quietly truncated to "rm -rf $HOME/".
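The failure mode can be sketched in a few lines (the paths and buffer size are illustrative, and the helper name is mine):

```c
#include <stdio.h>
#include <string.h>

/* Illustration only: build a shell command in a buffer that is one
   directory name too small. strncat truncates quietly rather than
   reporting an error, so the "successful" result is dangerously wrong. */
void build_cmd(char *cmd, size_t n, const char *path)
{
    snprintf(cmd, n, "rm -rf ");
    strncat(cmd, path, n - strlen(cmd) - 1);
}
```

With `char cmd[17]` and path `/home/me/tmpdir`, this produces `rm -rf /home/me/`: every call "succeeded", yet executing the result would delete the wrong thing entirely.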


strncpy is not a safe strcpy, for any value of safe. Period. The 'str' in the name is really misleading, as it's not really a string function to begin with.


> If the input 't' doesn't have a null terminator

Then it's not a string.


Sure, but it is still a valid 'char *' :)


That's something C's type system doesn't check for. If you want protection for this case, use C++ or any other higher-level language instead.


Well, to be pedantic, C++'s type system doesn't check for that either; it just passes around a size_t and a char *.


I was referring to std::string, which is what you should be using if you're handling textual data natively.


The implementation of std::string is 99.999% of the time struct { char *s; size_t len; }, which has nothing to do with an actual type.


And it's even possible to use std::string as a buffer for binary data including NULs. I won't recommend it, but it works.


I'd suggest using a std::vector of byte-sized integers for clarity, though there's nothing wrong from a standards point in using a std::string.


Also, it allocates ints x and y, saves a value to y, copies it to x, and only uses x from then on. y is entirely redundant.


The "vaccine" you refer to here is racotumomab. It's not a vaccine in the common-parlance sense ("prevents disease"), but a different immunotherapy treatment which is only approved for use in Cuba and Argentina. I'm mildly curious why it doesn't exist in the US. All of the clinical trials in the USA that I can find on that drug are either incomplete[1] or completed-but-without-results[2][3][4]. For all of those trials, "Laboratorio Elea" is the sponsor or a collaborator, so I presume they have the rights to the drug in the USA. That's apparently a company out of Argentina[5]. I don't know why they seem to have given up on getting the drug approved in the USA, but Wikipedia says "[a study] is underway in Argentina, Brazil, Cuba, Indonesia, Philippines, Singapore, Thailand and Uruguay", though the citation for that seems incredibly suspect[6].

[1] https://clinicaltrials.gov/ct2/show/NCT02998983

[2] https://clinicaltrials.gov/ct2/show/NCT01460472

[3] https://clinicaltrials.gov/ct2/show/NCT01598454

[4] https://clinicaltrials.gov/ct2/show/NCT01240447

[5] http://www.elea.com/

[6] https://en.wikipedia.org/wiki/Racotumomab#Clinical_trials


Check out the cancer.gov page[1], which does a decent job of giving an overview. The drug in question, pembrolizumab, is a monoclonal antibody (you can tell because the name ends with '-mab'), and you can read more about how those work on this page[2].

[1] https://www.cancer.org/treatment/treatments-and-side-effects...

[2] https://www.cancer.org/treatment/treatments-and-side-effects...


Palantir.net is not the same Palantir everyone is talking about here. Palantir.net seems to be a website consultancy for Drupal-based sites.

Palantir.com doesn't seem to have anything referencing the source of the name.


Ah! Thanks for the correction.


Submission which links to the underlying MIT News source: https://news.ycombinator.com/item?id=16401904



> This approach should prove useful for producing novel quantum states of light and quantum entanglement on demand.


Maybe it is a reference to the Rick and Morty episode "Anatomy Park". http://rickandmorty.wikia.com/wiki/Anatomy_Park_(episode)


Oh, that has to be it! So it's just a reference I didn't understand.


In the intro, it is stated that "literally five or less" cities do not have these monetary problems. I'm curious which cities those are, and why they are special. If the answer isn't "they've always used accrual accounting", I don't buy that most cities are doomed due to accounting problems.


The article author Charles Marohn addresses this question in the comments section. His entire comment response:

"They are the ones with a very dominant urban core, where the urban fabric overwhelms the horizontal, auto-oriented stuff. I'm not saying these places won't struggle for the same reasons Lafayette will, but I suspect their decline/contraction will be less pronounced, less a defining characteristic.

NYC, Boston, San Francisco, Vancouver, maybe Chicago.... I'm not an expert on this scale of a place by any means so I could be very wrong but they don't seem to have the same underlying forces as a Lafayette (or even a Detroit or Memphis) where 80%+ of their infrastructure serves unproductive land use patterns. Might be 20-40% in these places."


He is inserting his own bias without considering the obvious: infrastructure doesn't vote.

It doesn't really matter where you are; politics dictates that money goes mostly to people, not into the ground.

San Francisco, where I live, has loads of revenue and a citizenry which approves every single bond measure on the ballot. And we certainly shouldn't be on his list. Infrastructure issues here are never addressed until they hit a crisis level.

Just the other day there was news about how our seawall, which protects our entire financial district, is in horrible condition[1], and there's "no funding" and the usual blather about the Feds bailing us out. Is any other city program going to be cut a dime to fix the seawall? No. From a political standpoint it makes more sense to let the city flood than it does to cut off any short-term political gain.

[1] https://www.hoodline.com/2017/01/as-earthquake-threat-to-sea...


The author should spend time in Chicago. We've got ridiculous taxes on everything (the general sales tax alone is almost 11% now), and nothing in this city works. The trains barely run close to on time, the streets are riddled with potholes, and the police are so afraid of PR problems that crime is escalating.

Also, the weather sucks.


Chicago is fantastic. You should try a few other cities and report back.

Source: Moved out and now want to go back.


Really? I mean, if you have money and can make sure you're in specific areas of the city enjoying specific things, I can see it being great.

But a blanket statement like that about Chicago is very surprising given all of the negative things about its situation (unbelievable gun violence, for one).

Also I can never get over the hilariously bad 75-year lease of its parking meters where they got $1B up front in exchange for eschewing massive amounts of ongoing revenue (the lessees have made $650M+ in revenue in 6 years while sapping the populace dry). Now Chicago actually has to lose money and pay Abu Dhabi any time they want to shut down a street for maintenance or public festivals. Fantastic!


> (unbelievable gun violence, for one).

Gun violence does NOT happen in the "good" parts of the city to "good" people. I mean, yes, I am sure it happens. But in the good parts of Chicago, the odds of being shot are as low as or lower than in any other major American city. (Look at, say, the violent crime rate in Lincoln Park or Old Town.)

Deal with drugs or live in a bad area? Yes, not that safe. But I am lucky to avoid both of those.

The parking meter deal was total robbery. But Chicago has good income. The small town I moved to is like the city in this story: too many roads whose maintenance cannot be paid for. My city literally has 0 debt. Not low debt, literally 0 debt. But the average salary is 35k, and each year the city crumbles a little bit more. Chicago, with an average salary double that, is a more attractive place to live even with its debt.


That parking meter thing is terrible but it probably wouldn't have been politically feasible to raise parking fees to sane rates otherwise.

Also, I really like Chicago. It's a great place to live downtown (and unlike NYC, Boston, or SF, living downtown is not out of reach for people making under $100k).


What? I've lived in Chicago for about a year and its public transit is one thing it does right.


I've only been as a tourist, but I found the public transport to be among the better I've experienced in the US.


Yeah, as a transplant Chicagoan I'd have to disagree; on balance our infrastructure is pretty good. The CTA has worked well enough for me to commute daily for the last 8 years without owning a car, and they've been systematically replacing water and gas lines around my neighborhood for the past few years. Not to mention new green/public space investments like the 606, Maggie Daley Park, etc.

Road resurfacing is pretty good where I am, but oddly is largely a function of how effective the alderman for your ward is, since a lot of the cost typically comes out of their budget.

The real problems with the city IMHO are massive unfunded pension liabilities and a huge segregation problem between the north/south sides.


Maybe you should try moving to the north side of the city?



I think you may have misread the article; it seemed to me that blame was placed much more on political incentives that encourage outward, sparse suburban growth (the kind of growth that incurs enormously more infrastructure cost per capita), not on accounting practices.


Although there may be an argument that suburbs would never have been built if the infrastructure costs were front-loaded (e.g. >5% property tax).


Spitballing here, but I assume that once a city's density passes a certain threshold (e.g. skyscrapers), the revenue generated by that density overtakes the cost of horizontal expansion. Highly concentrated cities such as NYC/LA/Chicago are examples of this.


A local municipality here, Mississauga (https://en.wikipedia.org/wiki/Mississauga) grew as a typical suburb does, relying on aggressive expansion of low-density housing, with a large portion of that growth being in the 1960-1990 era.

The city pursued an agenda of low taxes by leaning heavily on subsidies paid by large-scale developers directly to the city. This was something very much embodied by the city's long-time mayor McCallion (https://en.wikipedia.org/wiki/Hazel_McCallion), who was something of a titan in her time and presided over the city from 1978 to 2014 with little political opposition.

The idea was that they'd build infrastructure to last 30-40 years and then figure out what to do later. Not surprisingly "later" came around all too soon and they were left scrambling.

The mayor pivoted from producing more sprawl, which just doubles down on the problem, to inviting developers to densify portions of the city, building condos and office towers. Through development fees, they'd try to work their way out of a jam without having to massively increase taxes for everyone.

It looks like this strategy has so far worked, but it's not without risk. It's dependent on passing the buck to the typically younger crowd that's buying condos. They're paying for sewer replacements in those older neighborhoods that apparently never paid their fair share in taxes. Who will bail them out when their time comes? Hopefully the increased density makes it more cost-effective to do that.

There are a number of things working in the city's favor: it's close to Toronto, so it's an ideal commuter hub, and the regional airport is there, so there's a large business hub built out around it. Without that tax base and proximity to another city, they'd likely be doomed. Nobody would ever want condos there.

If you're looking for those cities, look for suburbs built near major US cities that can leverage their location. Any that are on their own are ultimately doomed unless they dramatically re-work how they plan their urban layout. Low-density housing will strangle a lot of small cities to death.

Honestly, it should be illegal for municipalities to collect less in taxes than they need to maintain their infrastructure over the long haul. They should be factoring in 60-year replacement costs and collecting money towards them in the decades leading up to a major overhaul. A change in the accounting rules to count this sort of depreciation as an expense that must be balanced with revenue would be one step towards that.


I don't know which the cities are, but I'd guess they're among the wealthiest ones. Maybe you'll find this useful: https://en.wikipedia.org/wiki/List_of_U.S._metropolitan_area...


Probably wealthy + dense — I'd guess NYC, SF, Seattle, Chicago are on the list. Not sure what else would get up there.


According to this ranking, Chicago and NYC are at the bottom:

http://www.thefiscaltimes.com/2017/01/09/How-Strong-Are-Your...


That includes pensions, which are a separate issue; the Strong Towns piece is basically looking at annual property tax revenue versus the annual cost of maintaining infrastructure (and ignores deferments and the like).


I would be surprised, to be honest, if more than half the American population knew that a nuclear power plant literally cannot blow up like an atomic bomb. However, I can't find any information to support either of our gut feelings. All of the polling information is of the form "how much do you like nuclear" or "what is the best non-fossil-fuel power source". I'd love to see answers to "why do you think nuclear energy is a bad idea" or something similar.


While it isn't a proper poll, Veritasium's interviews[1] with the general public support the supposition that most people lack basic information about anything nuclear.

[1] https://www.youtube.com/watch?v=wQmnztyXwVA


Yes, yes, the best time to plant a tree was 20 years ago, the second best is today. Is nuclear any different? I am unaware of anything about nuclear energy that would have worked 20 years ago but is a bad idea to do today.


unlike 20 years ago, we're producing cheap solar panels faster than we can install them


We're not building grid-scale storage at that rate yet though


I read the wikipedia page and that "what is TRIZ" article, and I still don't understand what it is. At times, it sounds like an automated program (especially with statements like "More than three million patents have been analyzed to discover the patterns that predict breakthrough solutions to problems"). But at other points, it seems like a human-centric problem solving strategy, but without the strategy. It describes problems and then solutions without any discussion of the in-between.

Do you have experience with TRIZ? What "is it" to you?


They teach you this stuff in product design classes.

TRIZ is a way of breaking down an engineering design problem into the thing you want to change and the thing you can't change (a "contradiction"), then resolving it. A "TRIZ Matrix" is a reference tool that suggests ways of resolving conflicts between common design parameters (strength, weight, durability, manufacturing tolerance, etc.) based on principles that have been validated in practice, like "nesting" or "prior action". Over the years, 40 standard principles (and 39 parameters) have emerged. They all have somewhat cryptic, consultant-handbooky names but make sense when you see some examples[1].

E.g., you have a beam and you want to make it stronger, but can't make it any thicker. You consult your matrix for "strength" vs "area" and get some suggestions such as "use composite materials". Or, applying the principle more generally, you try to extract techniques from the patent library or publications that resolve the problem.

[1]: https://www.triz.co.uk/files/triz_40_inventive_principles_wi...


This reminds me of Brian Eno's "oblique strategies".


Thank you, this explanation made a lot more sense to me.


I had some exposure to it and a few classes more than a decade ago now, but it didn't really end up being anything that was a good fit for me and I never got into the group of folks trying to do TRIZ for software development.

Probably in part because of the background of the creator and the problem datasets used for the traditional contradiction matrix it always seemed to me to be a better fit for manufacturing and physical goods.

The "Interactive Contradiction Matrix Beta" linked at the top of the TRIZ Journal site may be worth looking at, but it's kind of cryptic. Basically you pick out a few areas of concern - as an example I picked (on both axes) 1: Weight of Moving Object, 9: Speed, 15: Duration of Action of Moving Object, and 27: Reliability. Based on that, the recommended areas that I should be looking at for possible improvement potential would be 35: Parameter Changes (turns up 6 times), 3: Local Quality (5 times), several others at 4 times, etc. Hitting the Analyze button on that tool will give expandable examples for the various areas - for example "Parameter changes" includes a lot of changes to temperature, state (solid/liquid/gas) and consistency. An example might be making liquid-filled chocolates - do you have to fill the chocolates? Can you have frozen chunks of filling that you coat with chocolate instead?

