I simply do not understand this anti-college, anti-credentialing sentiment. I am featured in the latest Hacker Monthly (the print version of HN), so they asked me for a bio. I wrote something about how I learnt everything in college... am not a hacker... and that you should go to grad school if you want to get better at CS. They edited out all of the pro-college stuff and just said this guy is a data scientist.
You don't figure out Dirichlet allocation, principal components, and matrix regularization hacking away in your garage. This stuff isn't going to occur in your mind out of the blue. It's fairly complicated, and even those of us who were systematically taught these things at school take years to internalise it. Don't downplay education. You are missing out on a treasure trove of knowledge humanity has collated over centuries, just to hack away and reinvent the wheel by yourself... well, good luck with that.
Speaking only for myself, it comes from a combination of selection bias and general mistreatment at the hands of others.
When you see someone who has three degrees in various flavors of computer science (BS, MS, PhD) who doesn't know what happens when you use "==" with two floats (see the sketch at the end of this comment), it builds.
When your boss tells you that he won't pay you any more because you don't have a degree, it builds. When you find out he's paying the three network engineers 3x what you get and they don't even really understand TCP/IP (so they wind up coming to you for everything), it builds.
When you see someone who is hired as a sysadmin because they are clearly such a great computer scientist and can do anything, and then who can't manage the simplest of Unix maintenance requests, it builds.
Any time someone tries to use their degree as a club instead of a wall covering, it builds.
That's just me. I can't say why other people feel that way.
For the record, I only got my degree two years ago this month. I mostly did it because I really needed to "walk the stage" at last. Everything else was secondary. Now that I have it, does it make me any better than others? No way. If anything, it puts me behind the 8-ball having to pay off these stupid loans for the rest of my life.
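(For anyone wondering about the float thing: here's the classic illustration, a minimal Python sketch. The exact digits printed may vary by platform, but the False is universal for IEEE 754 doubles.)

    import math

    # 0.1 and 0.2 have no exact binary floating-point representation,
    # so their sum is not exactly 0.3.
    print(0.1 + 0.2 == 0.3)   # False
    print(0.1 + 0.2)          # 0.30000000000000004

    # The usual workaround is to compare within a tolerance:
    print(math.isclose(0.1 + 0.2, 0.3))  # True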
> When your boss tells you that he won't pay you any more because you don't have a degree, it builds.
It's just a convenient excuse. If it wasn't a degree it'd be something else. The boss was negotiating, so negotiation skills are what's needed in this situation, rather than a degree.
You're quite right. I was too young and inexperienced to know it at the time. It took several years to realize just how much they were taking advantage of me.
Still, he said it, and it incremented that internal counter of "reasons to resent", even if it was unfounded.
It's best to ignore any objections and focus on getting paid a market rate (what other companies would pay). So something like "I'm worth $x". I find that the other side quickly stops raising objections and shifts to compromising on a $ figure. Still, jumping ship might be easier than getting a raise that's well beyond the rate of inflation.
Wry smile. Nothing wrong with what you've said. But come to think of it, with this answer you've endorsed rachel's POV about resentment building up. If jumping ship is the kind of disruption one has to make over what amounts to "convenient excuses" (while the manager goes on to pay 3x more to "worthless" network engineers -- read her reply above), then I say, let the resentments boil over!
I meant that jumping ship might be easier at any company. Businesses put rigid limits on raises. If you want to make a market rate it's best to get it when hired.
I'm of two minds on this. 1) I'm very much NOT anti-college/anti-credentialing. 2) I don't see your average business application software developer having any need for a degree in CS.
I believe the anti-college/anti-credentialing stance stems primarily from how people are sold on degrees. People are told they will get better jobs and make more money if they get one. However, no one really takes the time to tell them that just getting a degree does not qualify them to immediately go get a job doing exactly what they want. Essentially, people have been oversold on what the degree actually gives them.
My belief is that a vast majority of the technology related jobs are the modern day equivalent of 'blue collar' work, more akin to tradesmen, such as electricians. If you read up on the requirements for an electrician in the U.S. (a brief read of http://en.wikipedia.org/wiki/Electrician#United_States is good enough), you'll see it follows the same general trend we see with software development experience, even if the lines aren't as clearly drawn.
One caveat: I don't have a degree. I'm close enough that I could go back and finish it, but I didn't drop out by choice. I'm of the firm belief that it is possible, but significantly more difficult, to gain the same knowledge outside of a college or university; it's just harder to quantify the knowledge you have. I would also suggest everyone go to college and get an undergraduate degree in something that interests them, but focus on core classes and general requirements, not their particular interest. Universities are great for two things: imparting a general base of knowledge and specialized knowledge. However, undergraduate degrees have shifted to focus on specialized knowledge at the expense of general knowledge.
One last note in my long-winded comment, I really appreciate how you state 'you should go to grad school if you want to get better at CS' rather than the usual defense of schools saying that it's absolutely needed.
> Essentially, people have been oversold on what the degree actually gives them.
The degree certificate itself, perhaps. But I think people also miss what can come along while getting the degree. Networking with senior faculty members. Participating in undergraduate research. Independent classes with those faculty members. Networking with peers of similar interests.
It's what the student makes it. As such, you're quite right that just "getting a degree" doesn't make you qualified for anything aside from being a degree holder. I think it's important to keep this in mind when discussing undergraduate degrees.
> ... vast majority of the technology related jobs are the modern day equivalent of 'blue collar' work ...
That's certainly how most tech workers are treated. However, should they be? Even for non-degree holders (or those like me who transitioned from other fields), it's highly creative, non-mechanical work. The notion of treating tech workers as blue-collar ones stems, I think, from the black-box, sophisticated nature of the work (and perhaps the introversion of many in the business), which the industry has shoehorned into previously known moulds.
It didn't sound like "anti-college, anti-credentialing" at all. All the OP was trying to say was that the traditional method of teaching, i.e. "all theory", didn't work for him, while a mixed hands-on/practical/theoretical approach did.
Your comment, while true, is fairly irrelevant in the context of this article.
I'm the OP. I meant that the experience that you have matters, the degree doesn't.
Pursuing a degree gives you valuable experience - it would be a disaster if it didn't. There are other ways to get experience as well and some of those are better suited for certain individuals like myself. Our industry, when it works well, is meritocratic. If you are great at what you do the way you collected that experience is irrelevant.
I couldn't tell you who at Shopify has CS degrees and who doesn't. It simply never comes up. What comes up a lot is how good and how helpful people are and there seems to be little correlation with the degree.
The caveat to this is that having a CS degree does not guarantee that valuable experience was gained. There were people who stood next to me when I received my CS degree that were unable to craft basic HTML.
On the other hand, I enjoyed my entire time in college, and I feel that I received a great deal of valuable experience, but I also did a great deal of coding outside of school. I think in my case I would have been fine without getting a degree, but it helped to hone some of the areas that I was not very strong in.
We had a similar experience at the last company I worked for - except even worse. Applicants for the Data Science team who held degrees in CS/CIS/CE were, on average, FAR WORSE than ones who held degrees in other disciplines or simply did not have an undergraduate degree.
It got to the point where we heavily weighted applicants who studied Physics/Mathematics/Economics over the CS ones.
As my co-worker said: If someone in our city has a CS degree and needs a job, he probably really sucks at software development.
The real competitive advantage was hiring people with very good math/quant skills and teaching them software development concepts through Coursera and pair programming. We saved a lot on salary and we got very good developers.
I think that it's more useful to look at the whole paragraph that quote comes from:
"Not that degrees matter anymore. They do not. Experience does. That is one of the things my apprenticeship and the dual education system in general taught me: experiencing and learning things quickly is the ultimate life skill. If you can do that, you can conjure up impossible situations for yourself over and over again and succeed."
This is only my opinion, but when I look at the OP's statement in that context, I think that he's arguing that a credential isn't as important as the ability to trust that you can learn your way through hard problems as they come up. In that regard, I completely agree with him. I think we all know some people with great educations who, when faced with a seemingly insurmountable problem, will spend days finding literature to support the conclusion that the problem can't be solved. And conversely, we also know people with little education who look at insurmountable problems as fun projects. Personally, I suspect that this difference comes down to attitude and experience. On one hand, it takes a really great attitude to consistently stare down difficult problems. But on the other hand, I think that with experience comes confidence, and solving really hard problems takes as much confidence as diligence.
That is, of course, all my opinion and there's a high probability that I'm wrong....
I'm definitely not one that thinks credentials / college matter for most software developers (or frankly, people in general) but setting that aside and looking at what options exist:
I think a big part of the problem is the irrationality of getting a degree in 'Computer Science' to be a programmer (aka 'Software Engineer'). My degree is in Electrical Engineering with a Computer Engineering focus and I would never have gone for an Electrical Physics degree. I wanted to build things, not be a research scientist.
So, why is the de facto degree for programmers Computer Science? How much do you actually learn about building software in a CS program? My experience indicates: not much. You learn a lot of computational science - great - but most products have one or two difficult algorithms at their core, and those often represent an amazingly small percentage of overall development time. Where is the training for everything else that needs to be done?
The biggest problem is that there seem to be no good Software Engineering programs. Software Engineering degrees do exist, but they seem to be woefully outdated and only address a small subset of those people who wish to 'build things' with software.
I agree. I feel as though my time spent in computer science courses in undergrad and, even more so, graduate school, helped me become a better programmer than I would have been if I had spent those years just programming instead. It taught me to think differently.
Of course, this doesn't apply to everyone. A huge number of people skip computer science degrees and are still highly successful.
But one swallow does not a summer make, which seems to be the approach of many of these articles: Hey, look at me, I didn't go to college and I'm successful!
This seems to be an argument that is very difficult for either side to argue, because I don't see a way you could really look at a statistical analysis of the careers of CS grads versus non grads and make some kind of conclusion.
I think where the college and credentials bit is failing--for me at least--is the lack of differentiation between "learning to do" and "learning to think." I went to a liberal arts college and spent most of my four years learning to think. Liberal Arts are good for that kind of thing, and depending on your background (I'm not the product of highly educated parents) it can be extremely useful and a completely justified reason for going to college. One can learn to think elsewhere, but college is still a great place to do it.
As recently as ten years ago, it was all but impossible to learn to do outside of a formal training program, but today it's completely possible to learn to do almost anything via YouTube, iTunes, and the rest of the internet. And the problem is that today's university system--except in fairly rare cases--actually hinders learning to do. The world outside moves too fast for the educational bureaucracy to keep up. Formal education is valuable, but maybe more for learning to think than learning to do, at least in modern times.
>I simply do not understand this anti-college, anti-credentialing sentiment
Speaking as an anti-college, anti-credentialist person, well, a lot of it, for me? is that I didn't go to college.
I mean, everyone likes to pretend to be altruistic, but in the end? we all see the world from our own perspective.
And really? most people seem to think that you go to school, you get a CS degree, and you will be pretty good. And that's simply not true. I've hired people with CS degrees only to have to fire them because, it turns out, they didn't actually learn anything.
I mean, I've also worked with people who really did learn incredible things in school. Things, as you said, that I certainly have been unable to teach myself.
I also think a lot of this anti-college stuff is from people who went into debt for life, having been told they'd get useful job training and, you know, "find yourself" (you know, the class thing. No matter how much money I earn? I'll never be middle class; in the eyes of most people, I'll never be a 'professional', a 'real person', unless I go get a degree. I'm the guy who fixes the pipes. Which, eh, I am mostly okay with at this point.) - but anyhow, yeah, these kids graduate and find that the market value of their degree is, well, pretty close to zero. (And yes, a lot of these kids got degrees in philosophy or fine arts... and yes, we laugh at them for expecting to turn those degrees into money. But they were 17, goddammit, when they made that decision. What kind of asshole expects a 17-year-old to make good decisions, especially when they receive bad advice from every trusted figure they talk with? Then we go and say "unlike every other bad loan you will take in your life, the person giving you the loan takes no responsibility. You have to pay this one off. No bankruptcy.")
This idea that "you can do anything you set your mind to" also permeates education, and it's incredibly destructive; it's how we end up with those people with CS degrees who can't outcode me.
You don't learn to code in college, or at least not in class. Good coders almost universally learn outside the classroom. This is true for people who go to college and those that forgo it.
So if you don't go to college to learn to code, why go at all?
You go to learn the fundamentals of computer science and math which underlie everything that we do.
It's certainly possible to obtain this knowledge on your own with serious self-study. But it's a hard route (albeit one getting easier thanks to Coursera et al.) and given two candidates with equal coding ability, the one who went to college is far more likely to have a strong grip on theory.
Is theory actually useful? I would claim that it is, even for programmers doing CRUD apps and the like. Knowing the fundamental abstractions of computer science and the ways people have applied them in the past saves an enormous amount of needless reinvention. As a concrete example, parsing is a very well-studied problem in academia. But somebody without that background trying to write a parser will struggle far more than someone familiar with CFGs or PEGs.
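To give a taste of what that background buys you, here's a minimal recursive-descent parser for arithmetic expressions in Python - a sketch only, with the grammar and error handling deliberately simplified (it assumes well-formed input):

    import re

    # Grammar:  expr   -> term (('+'|'-') term)*
    #           term   -> factor (('*'|'/') factor)*
    #           factor -> NUMBER | '(' expr ')'

    def tokenize(src):
        return re.findall(r"\d+|[()+\-*/]", src)

    def parse(tokens):
        pos = 0
        def peek():
            return tokens[pos] if pos < len(tokens) else None
        def eat():
            nonlocal pos
            tok = tokens[pos]
            pos += 1
            return tok
        def expr():
            node = term()
            while peek() in ("+", "-"):
                node = (eat(), node, term())
            return node
        def term():
            node = factor()
            while peek() in ("*", "/"):
                node = (eat(), node, factor())
            return node
        def factor():
            if peek() == "(":
                eat()              # '('
                node = expr()
                eat()              # ')' (assumes balanced parens)
                return node
            return int(eat())      # NUMBER
        return expr()

    print(parse(tokenize("1 + 2 * (3 - 4)")))
    # ('+', 1, ('*', 2, ('-', 3, 4)))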
Being able to program is just scratching the surface of computer science. It's necessary but not sufficient for being a great programmer.
>So if you don't go to college to learn to code, why go at all? You go to learn the fundamentals of computer science and math which underlie everything that we do.
This is why my company hired Math/Physics graduates instead of CS graduates - they had far better command of advanced mathematics and came much cheaper than CS graduates (who were generally worse applicants anyway).
>It's certainly possible to obtain this knowledge on your own with serious self-study. But it's a hard route (albeit one getting easier thanks to Coursera et al.)
Now, what little math and theory I have learned, well, really, what anything I've learned... I've found that it's dramatically easier to learn from books than from a lecture. First, people talking? really, really slow. I ain't no speed-reader, but I can manage 400 words a minute, give or take, without skimming; of course, I can go much faster when I'm just skimming over the stuff I already know.
My usual strategy is to buy several books at the same level on the same subject, preferably by very different authors. (I want at least one of the books to be a classic by the person who came up with the idea.) The idea being that each one is going to explain it in a slightly different way, and while I'm only going to retain a small portion of each book, well, books read fast, they don't cost much, and if I take away 10% of each book? I'm doing pretty okay.
(I mean, that said, my math and my theory is still pretty weak, compared to many of my educated peers who work at the same payscale, So I'm not saying my strategy is good or anything... it's just that I've never gotten much at all out of lectures. Payscale is also interesting... it seems to scale more with negotiation aggression than anything else.)
>Is theory actually useful? I would claim that it is, even for programmers doing CRUD apps and the like. Knowing the fundamental abstractions of computer science and the ways people have applied them in the past saves an enormous amount of needless reinvention. As a concrete example, parsing is a very well-studied problem in academia. But somebody without that background trying to write a parser will struggle far more than someone familiar with CFGs or PEGs.
Why the fuck would you write your own parser for a CRUD app?
Seriously, that's what's wrong with the world.
I'm a SysAdmin. The way I see the industry? you hire a bunch of PHP monkeys to slap something together. Customers don't like it. You pivot, come up with some other business idea, hire PHP monkeys to slap something together (usually using the same monkeys and the same code), and you repeat until the market likes the output.
By this time? It's a giant hairball. It's disgusting. But you are getting users, and it's making money, so you've gotta scale.
Now, you could pay a bunch of 'real programmers' to write you something good, but that can take months or years even with competent management.
So what do you do? you hire someone like me. I show up and put a caching HTTP proxy here, I put some indexes on the database (and put it on a beefy server, and maybe even set up a read-only replica), etc., etc.... I slap on duct tape until you get around to having a competent person rewrite the whole mess. That's the "computer janitor" role.
So yeah. that's why I hate CRUD programmers trying to re-invent the fucking wheel. There are plenty of parsers out there. Use them.
This, this is why sysadmins hate NoSQL. Seriously, in a few days I can make a crashy Access DB 1000x faster and more stable fairly easily by ODBCing the data out to a reasonable database. Relational databases are incredibly easy to tune and improve, and I don't need to understand your giant PHP hairball that is full of global variables. USE AN RDBMS. Then you can do your job, and if the thing takes off and you need to scale, I can come in and do mine.
That's sort of my point. People without a computer science education often lack the awareness of the gaps in their knowledge. So they don't set out to write a "parser," because they don't know that word.
They're faced with the problem of parsing some custom text format. Maybe they'll try some terrible regex hack (see: http://stackoverflow.com/a/1732454). Maybe they'll stumble onto recursive descent parsing. But if you've taken a class on compilers (or automata theory) you'll know when regular expressions are not up to a parsing task, and you'll know about parser generators and how to write a grammar for one.
How can these CRUD programmers avoid re-inventing the wheel if they've never been exposed to it?
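To make the regex pitfall concrete, here's a toy Python sketch (the naive pattern is deliberately simplistic, but the failure mode is real - regular expressions can't count nesting depth):

    import re

    # A naive "balanced parentheses" pattern accepts strings it shouldn't:
    naive = re.compile(r"^\(+[^()]*\)+$")
    print(bool(naive.match("((x))")))  # True  -- correct
    print(bool(naive.match("((x)")))   # True  -- wrong, it's unbalanced

    # A trivial counter (i.e., a pushdown automaton) gets it right:
    def balanced(s):
        depth = 0
        for ch in s:
            if ch == "(":
                depth += 1
            elif ch == ")":
                depth -= 1
                if depth < 0:
                    return False
        return depth == 0

    print(balanced("((x))"), balanced("((x)"))  # True False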
"relational databases are incredibly easy to tune and improve"... only as long as they are on a single beefy server. When it is not enough, and very often it is not enough, things start to become very hairy up to the point where you have to rewrite your application using a proper NoSQL solution and with some people who understand databases a little better than "See ma, I put SQL here and it automagically evaluates..."
Servers with up to 512GiB of RAM are pretty cheap these days. That is /big/.
The thing is? last I looked, none of the NoSQL databases handle multi-master replication, either. In all cases, if a server goes away, you have issues. With SQL, you promote your most up to date read-only replica to read-write master, re-point the other slaves and run. With modern SQL databases, you can even control how out of sync you allow your read-only replicas to be (to the point of making it so the transaction isn't complete until it's been written not only to the master, but also to the slave... of course, that costs time. It's a tradeoff you can tune, depending on how much your data matters.)
Now, if you just need more write performance than a single beefy server with 512G of RAM and a bunch of SSDs can bring (which is a lot), then you shard. At that point, you are pretty much where most NoSQL systems are, except that most SQL engines come with a bunch of neat tools for finding slow queries (and for figuring out how to make those queries fast without changing the PHP) that the NoSQL stuff seems to lack.
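As a rough illustration of how cheap that tuning can be, here's a Python sketch using sqlite3 as a stand-in for a real RDBMS (the table and data are made up; the same EXPLAIN-then-index workflow applies to MySQL or Postgres):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, email TEXT)")
    conn.executemany("INSERT INTO users VALUES (?, ?)",
                     ((i, f"user{i}@example.com") for i in range(100000)))

    # Without an index, the planner falls back to a full table scan:
    query = "EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = ?"
    print(conn.execute(query, ("user99@example.com",)).fetchall())
    # ... 'SCAN users'

    # One index later, the same query becomes a B-tree lookup:
    conn.execute("CREATE INDEX idx_users_email ON users(email)")
    print(conn.execute(query, ("user99@example.com",)).fetchall())
    # ... 'SEARCH users USING INDEX idx_users_email (email=?)'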
The author is simply trying to say that going to college was not for him, since he is a kinesthetic learner and prefers to learn by doing.
I believe he is advocating a dual system of education combining college and apprenticeship, since there are different types of learners out there. College is the 'only way out' for many people; otherwise you are considered a failure and cannot find a job.
A dual system, if you wanted to just turn out programmers who know the basics of scripting, how to use a language, and how to build applications, might work.
If the "dual" system were to include architecture, algorithms, discrete math, and deeper subjects that a typical CS curriculum usually covers, what's the point of an apprenticeship system if it is replicating academia?
I can see a need for a trades-like track, but here in the US we already have an assortment of community colleges, certificate programs, and the like that replicate that model.
Exactly. I feel like every topic that enters HN becomes a replication of old HN discussions. The post is much more about creating an educational institution that is an alternative to high school/college than about diminishing the value of college. The author actually explicitly says that this is common in Germany.
And saying that your personal experiences in college were important to you, so everybody should try it, is the same as saying that my personal experiences playing competitive soccer were important to me, so everybody should try that.
Unless there's money to burn, a career should be treated like a business. For many people, getting a CS degree would be a big net loss. One can certainly master Dirichlet allocation from the comfort of home, no fees required.
Is it intrinsic to this treasure trove that the knowledge it contains must be transmitted for pay, with exercises to be completed on a deadline, and with little choice on the learner's part as to what is learned?
Have you ever seriously considered PhD programs? No one that I've ever heard of has ever had to pay tuition to attend college for their PhD (myself included). It is usually a "work exchange for education": a research or teaching assistantship while taking classes and writing a dissertation.
And to say that the learner does not get to decide what they will research for their dissertation is just flat-out wrong. It is assumed that the student will pick what to write their dissertation on - they need to study it in-depth for 2-4 years and picking something that they do not find interesting will most likely lead to dissatisfaction (but it is still a choice).
Granted, a PhD program requires a prerequisite BS degree or equivalent which is generally not paid for and follows a typical regime. However, not all knowledge is the same - highly specialized "brink of human understanding" type learning is paid, unstructured and definitely has no exercises to complete (since the one learning them has the most understanding of the topic).
Honestly, that sounds like paradise to me, but for some reason (one I wish I could fully identify and figure out how to beat) I've been unable to negotiate undergrad.
No, it needn't be transmitted for pay. In Argentina, for example, public college is good and free.
More importantly to your question, teaching a hard curriculum to a lot of students at the same time is a hard problem, and exercises with deadlines and a relatively tight syllabus are a method that has been shown, for many years, to work for a large number of people. It certainly doesn't work for everyone, and it certainly isn't the best way to learn, but it's an effective tool for the problem at hand.
I won't argue that private college in the US isn't brokenly expensive for most majors. I'm not even saying that college is a good idea for most of the population anywhere. What I am saying is that for some important fields of knowledge that it would be good to teach to some not-small fraction of the population, college is a good solution; maybe the best that has actually been tested. Of course not going to college would be a great choice for a lot of people who are going there. But going is also the right choice for a large number of people.
>I won't argue that private college in the US isn't brokenly expensive for most majors.
The University of Washington is $13,000 per year for residents, who are preferentially screened and rejected because tuition for in-state residents is too cheap.
UW costs $30,000/year for out-of-state students, who are more likely to be admitted since they pay more than twice as much.
By the way, the University of Washington is a public school. Private schools in the area start at $45,000/year.
There is literally nothing you can study in undergraduate school that would be worth that price. Not even close.
UW is overwhelmed with demand. They've set up satellite campuses in Tacoma and Bothell to handle the overflow, especially for business courses.
When I went there in 2003-2007 it was still a deal for the quality of education, and my degree was definitely worth what I paid for it (about $5,000-6,000/year) but the price has more than doubled since then.
If I had to choose a university at current prices I'd probably go to Western or WSU instead.
I think $13,000/year is absolutely reasonable at UW. However, you have to be exceedingly intelligent to get in there, which most people obviously are not. They screen in-state residents because they don't get enough tuition from them, so the standards are very high if you live in WA state.
So I question whether or not the University of Washington is really a public institute in that regard.
Let's not even talk about what it costs to go to Seattle University or Gonzaga.
I'd argue that it has not been shown to be effective. We're reaping the fruits of that today: grade inflation, disinterested professors, and new graduates without the jobs that served as the carrots to get them into school in the first place.
College is expensive, and if you need to take out a loan to go there (most do) it is risky. Not only that, but credentialism (the idea that college exists in order to get credentials) coupled with the enormous expense has led to declining standards and quality at American schools. It is not unreasonable to ask why one should be paying so much and taking on such a large risk for an education that is of questionable quality, particularly in fields where vocational skills can be learned elsewhere.
In many engineering cultures, employees are perpetually required to play the role of pupil as a form of submission, despite credentials, education, and experience. This is where all the downplaying of college and credentials stems from. There is almost no corporate culture that does not require new employees to play a submissive role to whoever is already there. Find one, and you'll maybe find people who value college and credentials.
The point isn't that you shouldn't learn latent Dirichlet allocation, but that the best way to learn it is to come upon a problem that warrants it and then have a mentor point this fact out to you, motivating you to learn it so you can expand your capabilities.
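For instance, here's a hedged sketch of what stumbling onto topic modeling in practice might look like, using scikit-learn's LatentDirichletAllocation (assuming scikit-learn is installed; the corpus and topic count are invented for illustration):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = [
        "the cache misses slowed the database",
        "index the table to speed up the query",
        "the striker scored and the keeper dived",
        "the match ended with a late goal",
    ]

    # Bag-of-words counts, then fit a two-topic LDA model.
    vec = CountVectorizer(stop_words="english")
    X = vec.fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

    # Print the top words per topic; databases and soccer should separate.
    terms = vec.get_feature_names_out()
    for k, topic in enumerate(lda.components_):
        print(f"topic {k}:", [terms[i] for i in topic.argsort()[-3:]])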
There are plenty of engineers who love to pore over bleeding-edge academic research yet have no problems to solve with it; when you ask them to hack together a prototype of an idea, they don't know where to start. I think there is a clear personality type that prefers to continue reading before doing, since there is an ever-present fear that just around the corner there is some kernel of knowledge that will remove the need for a massive body of work. In reality, this is almost never the case, and elbow grease and experimentation turn out to be the solution to many problems, not digging through libraries, frameworks, algorithms, and academic research.
There is a large category of jobs that don't require complex math or a good understanding of algorithms, yet we treat them like they require 4-year degrees. This is bad, because it wastes a lot of time and causes a lot of pain.
What I don't understand is this anti-learning-from-books and anti-learning-from-computers sentiment. In many cases and for many people, learning in nonconventional ways is much more efficient than going to school.
Most of the knowledge you will ever need you can attain for free on the Internet with MOOCs.
And some of the more advanced concepts that you mentioned are not beneficial to know for the majority of things you will do in day-to-day work, although I bet you can still learn a lot of them for free with online e-books.
If you want to do research on some really advanced topics then a degree is a must, but an average (high-)school education is a complete waste of time.
That knowledge didn't just appear once there was an internet; it had to come from somewhere. You're able to make geo-local picture sharing apps inside of instant messengers because someone with an advanced degree figured out how to make transistors. Then some other person with an advanced degree figured out how to make combinations of those transistors perform coherent operations to achieve complex goals. Then some other person with an advanced degree distilled the mathematical theories required to parse languages. And so on.
It's fine to say that you're getting by just fine with stuff you learned on the internet, but to call education a waste is to declare that what we have now is sufficient and there's no point in learning anything new. That might work for the Amish, but it doesn't work for me.
You are strawmanning. I wasn't claiming that all the knowledge that comes from people with advanced degrees is useless.
Most people don't need an advanced degree to contribute to society. Proportionally, only a small number of people are needed to make transistors and develop mathematical concepts for computers, compared to the number of people using their work.
While it is necessary to have people with PhDs conducting research for there to be progress, hackers and doers (who base their work on the research) are also needed to exploit the results of that research.
It is also possible to not have a degree and still succeed in life, and do things that you love.
Moreover, I think that nowadays it is becoming easier to get knowledge (that traditionally was only accessible in universities) for free on the Internet. I think the traditional education system is extremely limiting and archaic. It permits only people who excel in certain kinds of environments to conduct research.
Most people reading this site are not doing machine learning or advanced technical work. The day-to-day work of the vast majority of programmers does not require completion of advanced formal education (or the self-taught equivalent).
Even then it is not actually that hard to learn about principal components or matrix regularization if you have some mathematical background and the desire to teach yourself...
That's not the tone I got from the article at all. Your comment seems incredibly defensive and largely blinded by your own experience; how is your view any more valid than his? You must never have encountered any of the numerous CS graduates who can't code their way out of a paper bag.
I wouldn't characterize his post as anti-college in the least. It's cheap to say that Tobi's post espouses only one viewpoint; rather, he's giving an anecdote that suggests there's more than one way to do it when it comes to a career.
You could spend years in college, or you could just start trying to solve problems collaboratively that lend themselves to principal components analysis. You'll learn PCA much faster via the latter approach.
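As a rough sketch of that learn-by-doing route, here's PCA from scratch with numpy (toy data invented for illustration - center the data, take the SVD, and read off the components):

    import numpy as np

    # Toy data: 200 samples of 5 features driven by 2 latent factors.
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 2))
    X = latent @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(200, 5))

    # PCA by hand: center, then SVD. Rows of Vt are the principal
    # components; S**2 / (n - 1) is the variance each one explains.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = S**2 / (len(X) - 1)
    print(explained / explained.sum())  # nearly all variance in the top 2

    # Project onto the top two components for a reduced representation.
    X_reduced = Xc @ Vt[:2].T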