
That or configure the browser's default CSS rules.


Why should the writer cater to you?


This struck me as odd:

“For my grandchildren, the idea that reading is something you do by yourself will seem arcane,” he says. “Why would you want to read by yourself if you can have access to the ideas of others you know and trust, or to the insights of people from all over the world?”

Though what's written in a book might not come from anyone we know and trust (in itself not a "bad" thing), it is still quite likely a body of insight, possibly collective, and perhaps even from people in distant parts of the world.

Maybe I missed the point, but the quote seems to overlook that a book is already a view into others' ideas and insights.

That said, even if printed books fall out of favor with the majority of literate people, it could still be worthwhile to maintain physical libraries of works many would consider essential, in the (perhaps unlikely) case that digital records fail or become inconvenient to access.


That quote was strange to me too, but for another reason: reading is a solitary activity; the point is to think for yourself.


I'm not super interested in the thoughts of those I "know and trust" as I'm reading. A circle of real friends would be too small, and too unlikely to have all read the same things, for this to be effective, so I assume this is the social-network definition of "know and trust", which is to say "follow on Twitter"; that's an even stronger no. But high-quality, extensive commentary and annotation by multiple experts would be great. Many books have some of this in the form of an introduction and some footnotes/endnotes, but I'd love the ability to turn on much higher levels of it for, say, second readings.

Unfortunately it doesn't seem like ebooks are good at handling even basic annotations—certainly no better than dead-tree books—so I'm not seeing that happening any time soon. Plus if it ever happens it'll probably be some stupid online service, which I don't want. I want it to be part of the book, like DVD commentary tracks, or at least a downloadable add-on file of some kind that sticks around as long as I want it and can be backed up.


Historically, reading was something of a shared experience. Before the printing press, "scribes who copied manuscripts often made marginal annotations that then circulated with the manuscripts and were thus shared with the community; sometimes annotations were copied over to new versions when such manuscripts were later recopied". I'm quoting https://en.wikipedia.org/wiki/Text_annotation (references 2 and 3). Interestingly, reference 3 is this paper http://jbt.sagepub.com/content/15/3/333 (paywalled PDF, sorry) about the future of annotations.


You just have.


The language is a bit rough, but I'm curious to see where things go.


What's to stop (not merely slow down) someone, or some group, from turning the tide, so to speak, on RSS adoption by the tech-indifferent?

What mindset was there to begin with, prior to the shift to Twitter et al.?

Also, please, will you elaborate on the shift?


RSS won't be adopted by the tech-indifferent because it's a technology. It's not a service or a company and therefore doesn't have anyone with a vested financial interest pushing it as a platform. Also, it doesn't fit into the advertising-funded clickbaity world. It's therefore only adopted by people who think about what they are doing and what they want out of their internet, rather than people following the shiny path of least resistance and most popularity.


I'm in agreement.

(My mistake in using "adopt" when it was probably better to say "use".)


What are those issues?


Largely a focus on maximizing short-term "productivity" (as we understand it here and now) rather than allowing the leisure needed both for basic sanity and for larger thoughts.


I was about to write a longer post, but this is so incredibly complicated, and I am so biased, that I changed my mind halfway through. I'm honestly too biased about some of its points to provide anything marginally better than flamebait if I start going into details, but I can offer some hints.

1. Funding for research is increasingly scarce, and the people who decide how large the grants are and who receives them are increasingly clueless. This means that ideas are funded less and less on their scientific merit, and that more and more funds are handed out based on personal relations rather than genuine selection.

This is an incredibly important problem. The fools running it are (like most fools) too arrogant to realize that we, as a species, suck at making predictions about what we can do, both by being too optimistic (remember thirty years ago, when people thought we'd be writing these posts from a base on the Moon?) and by missing obvious things, like electricity, which was studied intensely for several decades before it found any reasonable application.

There are many ways to avoid handing money over to crackpots. The focus should be on those, not on trying to judge the merit an idea might have in practice. We're too stupid to do that yet.

2. The bulk of the work is carried out by underpaid people in bad working conditions. This makes smart people bitter, kills their productivity in the short term and their lives in the long term. I distinctly remember my first workplace after I dropped out: a start-up where we routinely pulled 60-hour weeks before we managed to hire enough people. That felt like a lot of free time to me; 60 hours is a light week for a PhD student working on anything worth a fuck. 120-hour stints are rare because of the physical strain they put on you, but you end up doing one every two months or so. Don't get me started on what that does to your life.

This may work if you have a bunch of Steve Jobs wannabes pretending they're scientists, researching how to sell things. It turns out it's disastrous when you're trying to do real work. A lot of the jokes we made in the office revolved around that. They stopped being funny when we pondered how many drowsy PhDs were simply too tired to realize they could do <this thing> and bring us five years closer to treating cancer.

3. A lot of undergraduate classes are becoming less and less fundamentals-focused, because academia is increasingly the place where you're trained to work in industry. Consequently, as people finish their senior years, they are less and less adept at research, though sadly more and more proud of their can-do attitude, and so happy to have a Computer Science degree to prove they can design websites, as if you actually had to fucking go to college for that.

It turns out these people are good enough for the wonderful world of startups. They're just as clueless about the real world as the people who pay them, their self-esteem is fragile enough, and they are so utterly inept at learning anything mildly complex that they are easily sucked into the industry vortex. This isn't true only of CS and CompEng; it's happening in fields like EE and Mechanical Engineering too, but the proportion of mission-critical applications in those fields is somewhat higher, so most of the hipsters get cured within their first months on the job.

This system, while almost satisfactory for industry, breeds very bad researchers. It turns out it's hard to do research into web communication if 90% of your graduates have mad CSS skillz but still have trouble explaining what O(log n) means.
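
(For anyone wondering: the canonical O(log n) routine is binary search, where each comparison halves the remaining range, so a million sorted items take roughly 20 steps. A minimal Python sketch, purely illustrative and not from the original comment:)

    def binary_search(items, target):
        # Assumes `items` is sorted. Each iteration halves the search
        # range, so the loop runs at most ~log2(len(items)) times:
        # that is what O(log n) means here.
        lo, hi = 0, len(items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if items[mid] == target:
                return mid
            elif items[mid] < target:
                lo = mid + 1  # target can only be in the upper half
            else:
                hi = mid - 1  # target can only be in the lower half
        return -1  # not found

    # 2**20 > 1,000,000, so at most about 20 comparisons:
    binary_search(list(range(1_000_000)), 424_242)  # returns 424242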

#1 and #3 above may be subjective, and there are a lot more points I didn't bring up, so as to maximize the chances of keeping the discussion civil. #2, on the other hand, while also something I'm quite subjective about (actually, it's the one I'm rather heartbroken over), is the one that escapes the scrutiny of people who are otherwise quick to cry out against the likes of Walmart.

Part of why I dropped out of academia (see below for disclosure) is that I'm really not that smart. I'm good, but not scholar material; I'm good at bringing together various technologies and finding atypical solutions to practical problems, but I suck at pounding on the same important issue for years and years at a time. I also suck at math; my brain isn't wired for it. So even if the working conditions hadn't been the way they were, I'd have left academia at some point, simply due to my sheer incompetence.

However, I had a lot of colleagues who were incredibly smart, people who could think about a problem in ways I never could have, and who were genuinely better at what they studied (electromagnetism) than I'd ever have been. (I ended up doing research in EE by sheer chance anyway; I got into EE because I thought it would make me a better programmer, which it did.)

You wouldn't believe how they changed over five or six years of working 10 to 12 hours a day, weekends included. I've heard more stories of wrecked relationships than I can count, and too many of them are struggling with depression.

Most of them don't regret it at all. They're happy with what they discovered and genuinely feel they made the world a better place, but having been there, I honestly can't help wondering if it was worth it. I never bring it up with them, for obvious reasons, but it's very sad, especially since I can relate.

(Almost) full disclosure: I am an academia dropout. I dropped out during my MSc studies, despite being on a fairly good track: several articles with my name on them, in several important journals (important as in "people who aren't scientists have heard of them"), even before I earned my BSc.


Oh God. You just described my life last year.

I'm taking much less coursework this semester, and now I find out I need 30 points of coursework to finish my MSc instead of 18. I would have been done with 18 this semester. I have to publish a thesis either way; it's a matter of whether they accept a non-Technion four-year degree as a four-year degree or as a three-year degree. I hate these anal, bureaucratic requirements; I just want to concentrate on research and in-depth issues rather than continually taking courses!

I just want to add one more issue:

4) In both teaching/coursework and research, academia is extremely detail-obsessed, constantly burrowing away from larger, important questions towards small, easily answerable ones. For a good example, look at how many different kinds of differential equations the average engineering major at a really good university is required to learn to solve, and then check how often they actually solve those equations in either original research or in their jobs. Sometimes, yes, they do, but often enough to justify two or three distinct courses in just differential equations versus, say, a single full course in fundamental statistics? Oh, but there are a thousand different approaches to statistics!

The result is a system that, seen from the outside, appears to be actively avoiding the truly major scientific problems. Sure, it can give you a seminar on the latest approach to convex optimization problems or pure subtyping theories, but ask what problems those solve, and we academics will look at you rather blankly.


> In both teaching/coursework and research, academia is extremely detail-obsessed, constantly burrowing away from larger, important questions towards small, easily answerable ones.

In research circles, this is simply because smaller, easily answerable questions are the ones that fit short-term grant applications. It's very unfortunate indeed.

In teaching circles, there is a related case, which manifests through this:

> For a good example, look at how many different kinds of differential equations the average engineering major at a really good university is required to learn to solve, and then check how often they actually solve those equations in either original research or in their jobs.

This is because the mathematics courses need to strike a compromise between teaching enough fundamentals to be meaningful as math courses and enough "practical" applications to warrant their presence in an engineering curriculum.

I was also annoyed at taking math courses for three. Fucking. Semesters. But looking back on it, while I have forgotten much of the detail they taught me, the type of reasoning they taught me stuck, and that's OK. I'm not sure there's a better way to teach it.

> Oh God. You just described my life last year.

I hope it gets better. Cheers!


For #3, I am starting to think we should split off a portion of computer science and call it software engineering. People who are interested in the theory of algorithms and the like can do computer science; those who want to learn how to build fault-tolerant, scalable systems can do software engineering. It is similar to how chemistry and chemical engineering are split.

Of course you still need a basic understanding of how computers work with a software engineering degree, but knowing how to mathematically prove that some algorithm is O(log n) is pretty pointless for most (not all) jobs in industry.

One thing that is not taught well across the board in computer science programs is how to write readable, maintainable code and how to work with a team using source control, bug trackers, and so on. To be clear, I am sure there are programs that do this well, but in my experience they are not the norm.


> Of course you still need a basic understanding of how computers work with a software engineering degree, but knowing how to mathematically prove that some algorithm is O(log n) is pretty pointless for most (not all) jobs in industry.

I agree, but IMHO most jobs in the industry that don't require you to prove that an algorithm is O(log n) shouldn't require a degree at all.

In the part of the world where I live, you can become an electrician right after finishing high school. You have to take a course and get a certificate as a legal requirement (as with any profession where you can get people killed), but it's pretty straightforward, and the course only covers fairly basic material.

And don't look down on electricians. My degree is in EE and when I need some work done on the installation in my house, I call an electrician. I could do what they do, but sloppily and with far more dangerous results.


No, you need a degree for most CS jobs. CS is too complicated; you need to know how the computer works at a theoretical level, and that requires a degree.

The CS equivalent of an electrician would be people who do basic IT support (i.e., telling someone to reboot their Windows machine) and easy programming, like creating a blog.


Do you? Half of my colleagues who do web development would be utterly unable to tell you what a TLB or virtual memory is, probably don't remember Ohm's law, and while they could probably write a sorting function on their own, chances are they'd fail even the most basic exam on data structures and algorithms.

And they can do their jobs just fine. This isn't 1995, when writing dynamic websites was a pioneer's job. What they do is enough of a commodity that it can be outsourced to college freshmen on the other side of the world. People don't need a degree to do it now, just as they didn't back then, only for different reasons.

Also, no, an electrician's work is far more complex than IT support. For one thing, the chances of dying from doing it improperly are disproportionately higher, and the amount of theoretical knowledge you need to be an electrician is not negligible at all. Your average hipster who's proud of the amazingly cool things he's hacking on his Arduino knows a lot less about it than your average electrician, even if the electrician isn't as obnoxious about it.


The software industry does not revolve around web development. Yes, it is a large percentage, but it is not the only type of software, and many web development jobs do require an understanding of data structures and algorithms. As always, it depends.


I don't see how this contradicts what I said. There are tons of jobs in the software industry that don't require anything advanced enough to need a degree. The JS hipsterisms are a prime example, but they aren't necessarily the only ones.


You seem to, maybe fairly, denigrate startups and industry in general as well as academia.

What should an intelligent person do? What did you settle on? What advice would you give to young people?


> You seem to, maybe fairly, denigrate startups and industry in general as well as academia.

Yeah, this is exactly why I changed my mind halfway through that long post :-).

I honestly love the startup environment; I'd much rather work at a start-up than at a large company. What I do denigrate is starting up for the sake of starting up: finding an (often otherwise legitimate) need, pounding at it for six months while it's still hot, and coming up with a technologically half-assed product to sell at a reasonable price.

This is, in my opinion, destructive both intellectually and technologically. It teaches bad habits and gives programmers little time to learn either adequate technologies or the fundamentals of their trade. I'd be a very rich man if I had a penny for every time I told a colleague enthusiastic about a new technology he'd discovered on Kickstarter or here on HN that <this operating system from the 70s/80s/90s> or <this thing from the 70s/80s/90s> already had it. You know something is wrong when so many new things are so similar to old things, so similar that they repeat the same mistakes.

As for academia, I still keep in touch with some of the people I worked with there. It's the most intellectually stimulating place I've ever worked, and contrary to popular belief, one of the most refreshing feelings a professional can have is to walk into a roomful of people and realize you're the dumbest one there. I miss being the dumbest man in the room.

> What should an intelligent person do?

I honestly have no idea.

> What did you settle on?

I haven't settled yet, but looking back, I'd say what I'm doing now is better than what I had back then.

Now I'm working for a largish company. Their business is entirely software, but they want to start doing hardware, and they brought me in to help. They pay me well and I can come into the office at 10 AM, which is good (I have some sleep issues). The work itself is shit: there's a long ladder of managers who are increasingly clueless about what embedded development means, but each of them has to deliver results (no matter how irrelevant) because they made promises. Consequently, most of what I do is pointless, but not entirely uninteresting. In the last month or so, I dabbled in a USB driver in the Linux kernel, hacked on an HTTP proxy, and helped a colleague build a PCB. It's useless, but not entirely devoid of fun. I intend to leave as soon as a place that actually needs me to build stuff, not massage egos, shows up, but until then, I can bear it.

That being said, it also leaves me enough free time for some hacking of my own, for my hobbies, and for family and friends. I have enough time to brew beer with my girlfriend, read whatever books I want, learn Go, and post crap on Hacker News. I'm far more unhappy with what I do than I was at university, but overall, I'm a happy person.


When you think "startup," what comes to mind? And for "small business"?


PG put it best in his essay "Startup = Growth"[1]:

"Let's start with a distinction that should be obvious but is often overlooked: not every newly founded company is a startup. Millions of companies are started every year in the US. Only a tiny fraction are startups. Most are service businesses—restaurants, barbershops, plumbers, and so on. These are not startups, except in a few unusual cases. A barbershop isn't designed to grow fast. Whereas a search engine, for example, is."

A barbershop is a great example of a small business. A barbershop is obviously different in nature from a search engine because it's not designed for rapid growth.

[1] http://paulgraham.com/growth.html


I agree, but even this definition feels a bit off to me. If, say, you start a new supermarket chain that is designed to grow rapidly, I wouldn't consider it a startup unless there's some kind of innovation in the business model. That may just be me, but I consider a startup to be a business that is designed for rapid growth and that is doing something new in one way or another.


I'm no expert on the matter, but to me a startup is trying to get big fast, whereas a small business is just trying to make enough money to live comfortably.


A startup nowadays is a business entity that tries to maximize value in the short term (i.e., by flipping), while a small business is a business entity that maximizes value for the long term.

The problem with long-term stories is that they're usually boring until the business sells or dies.


Just a nitpick:

"Stop struggling to find products and let Rockerbox give you hand."

You're missing an "a" between "you" and "hand."


Thanks! Fixed.


How is the exploitable amount of calories determined, exactly? Is there a layperson-readable resource somewhere that you might recommend?


I'm not sure what exactly OP is referring to, but there is a difference in the nutritional data of, for example, raw chicken breast (which is marked on the packaging) and baked chicken breast (which is what gets digested [unless you actually eat it raw]).

See http://nutritiondata.self.com/facts/poultry-products/701/2 vs. http://nutritiondata.self.com/facts/poultry-products/703/2

Make sure you line up the serving sizes.
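
(To "line up the serving sizes," normalize both entries to a common weight before comparing. A tiny Python sketch of the arithmetic; the numbers are made-up placeholders, not the actual values from the pages above:)

    def calories_per_100g(calories, serving_grams):
        # Scale a label's calories to a common 100 g basis so two
        # entries with different serving sizes can be compared.
        return calories / serving_grams * 100

    # Hypothetical label values, NOT real nutritiondata.self.com data:
    calories_per_100g(calories=110, serving_grams=100)  # 110.0
    calories_per_100g(calories=140, serving_grams=85)   # ~164.7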


Self.com has great data tables for many variations of foods.


It's not, for the most part.

