Hacker News

I am wondering if the NSF program officers responsible for overseeing the grants should be contacted, as they would have the power to hold the University and the PIs accountable and/or start their own investigations.

They can easily be found on grant webpages, e.g. https://www.nsf.gov/awardsearch/showAward?AWD_ID=1900713



The program officers were contacted, leading to the OIG being notified on 6 December 2019. Supposedly an investigation was opened, but we have not heard anything back since then.


I just want to thank you for your actions and ask if you have any advice on career paths for doing something about this type of academic bullshit. My GF is currently pursuing a PhD in biochem but is very interested in science policy, so that she can do something about all the injustice one witnesses as a graduate student.

As someone who received both their B.S. and M.S. at UIUC, I'm honestly disgusted at how various professors and administrators have handled certain incidents that my friends have been part of.


there's really not much one can do. In Comp Arch, most of the honest people left the field years ago because they got sick of all this nonsense. Most were happy to let those left behind play their silly peer-review games, but when the student suicide happened it was like a call to action.

It is easier to fight things from the outside, as the internal higher-ups have less power to mess up your life.

It's interesting you mention UIUC, as traditionally that was the source of a lot of suspect papers. It was a running joke about the "coincidence" of how many people with UIUC connections managed to get papers into ISCA each year.


Is this true? Can you expound on this? Which papers are suspect?

I know there are a lot of solid professors at UIUC, like Josep Torrellas, Sarita Adve (memory consistency models, including C++'s), and Vikram Adve (the advisor of Chris Lattner, creator of LLVM). This is really bothersome and I hope it's not one of those professors.


well let's take Torrellas. Check out the two papers at ISCA'20 with him as an author.

Both use modified "cycle-accurate" simulators for the results. For now let's ignore all the issues with the accuracy of these simulators.

Let's validate how "solid" the work is. Drop an e-mail to Torrellas and ask for the code they used so you can see if you can reproduce their work. Hopefully things have changed and they'll send you the code but in my experience they'll just say no.

So they got in two papers that are unverifiable, and none of the reviewers ever saw the code involved. This bothers some people, but it's not unusual at ISCA.


1. a torrellas student (now at mit) released code for her spectre defense. yeah, maybe it was to help her job hunt, but at least it proves (to me) that torrellas hasn't just been pushing crap in the past. i understand you have issues with torrellas's handling of this incident (as do i), but taking issue with his isca papers seems like overreach.

2. i don't understand how you can pick on "cycle accurate" simulation when every simulator in our community has problems. at least with gem5 (used in one of torrellas's isca 2020 papers), we as reviewers can look at the code. how about the famous "in-house x86 pin-based simulator"? most pin-based simulators' performance numbers should rightfully be a joke, but we use them anyway, because we aren't going to rewrite them.

3. at the end of the day, most of our work is unverifiable, because we make so many approximations anyway. one young faculty member told me "we just need to see the idea and determine if it makes sense". i just don't know if this is the right thing to do, or if it's a lie we tell ourselves.


> none of the reviewers ever saw the code involved. This bothers some people, but it's not unusual at ISCA.

There are a lot of papers in ISCA that are based on cycle-accurate simulators. It has been like that since forever. How else would you evaluate new, non-existent, bleeding-edge architectures? FPGA? Most can't even afford that, and it's just not possible in a lot of cases anyway. I agree with you that some work should be verifiable, but your accusation is weak on this point. Maybe the community can push for verifiable work as a prerequisite for publication at ISCA, since it is such a prestigious conference.

If you don't want papers based on cycle-accurate simulators, you'd have to eliminate a large body of work that is only possible with this approach. A lot of techniques in modern processors got their start in processor/system simulators, and most of those were probably not cycle-accurate.


> If you don't want papers based on cycle-accurate simulators, you'd have to eliminate a large body of work that is only possible with this approach.

Yes. Most "cycle-accurate" results are garbage. Do you show error bars on your results? Can you? Did you run on a variety of independently implemented simulators and show the results on all of them? Did you run the full reference inputs to SPEC all the way through? Why not?

The answer seems to be that it would be hard. But guess what, science is hard. Try complaining to a biologist sometime about how long your architecture paper took to write, when you gathered all the results two weeks before the paper deadline.

It's fine if you come up with a new idea and run some simple proof-of-concept runs to show it might have merit, but don't pretend the results from an academic simulator hacked together by a sleep-deprived grad student have any real world merit.
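And the error bars aren't even expensive to produce. A minimal sketch of what I mean, with made-up IPC numbers standing in for measurements of one benchmark across several independently implemented simulators (all values hypothetical):

```python
import statistics

def mean_with_ci(samples, z=1.96):
    """Mean and approximate 95% confidence half-width
    (normal approximation) for independent measurements."""
    m = statistics.mean(samples)
    s = statistics.stdev(samples)           # sample standard deviation
    half_width = z * s / len(samples) ** 0.5
    return m, half_width

# Hypothetical IPC results for one benchmark on five
# independently implemented simulators (illustrative only).
ipc_by_simulator = [1.42, 1.31, 1.55, 1.38, 1.47]

mean_ipc, ci = mean_with_ci(ipc_by_simulator)
print(f"IPC = {mean_ipc:.2f} +/- {ci:.2f}")  # prints: IPC = 1.43 +/- 0.08
```

If the half-width is a sizable fraction of the claimed speedup, the result is noise, and that's exactly the check most papers never show.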


omg running ref spec on gem5 fs mode...


can you expand? i know the clique, but at least the specific work i follow hasn't been bad recently.

of course, nothing surprises me.


It isn't always that the work is bad; it's just that the review process for getting in has always seemed easier for people with the right connections. Not necessarily in an overtly conspiratorial way like the recent allegations, either.

The reason people have latched on to the current set of allegations is there seems to be actual concrete proof of misconduct that can be acted on.


> it's just that the review process for getting in has always seemed easier for people with the right connections

What is this insinuation based on? Do you have data on how many papers by "people with the right connections" are rejected, for example?


it's anecdotal.

Another anecdote: when first trying to raise awareness of the issue with members of the community, I never got the response "wow, how could this happen in computer architecture?" Rather, the response tended to be "wow, I can't believe it finally got bad enough that a student died".


Because these anecdotes are unsubstantiated, they can come across as sour grapes, which I think damages your good cause of exposing the serious misconduct for which there's actual evidence.


i think this is just anecdotal; this person is clearly in the community.

i'm not sure this data is useful, because we'd disagree on "people with the right connections".



