> I had another technical career previous to this one
This seems to be the huge differentiator.
I've had really good experiences with boot camp grads who had a previous technical career or came from a STEM phd. Camps are great for getting already well-educated people up to speed on how to hack together a website, which is an enormous value add to an existing skill set.
Absolutely. Familiarity with troubleshooting, understanding how layers of technology interact, dealing with new tech regularly, and working with technical teams are all a big leg up.
> It sure would be nice if a college education was about learning and growth and discovering oneself and all the hoopla they feed you during visits and orientation, but it's a structured credentialing system that involves professors for some reason, and it's effectively mandatory for making an above-median wage.
College is what you make of it. It's 4 years to spend at least half of your waking hours learning new things. You can waste a bunch of that time and still get a credential with vanishing return, or you can invest all of that time in building highly valuable skills (while also getting the credential).
N=1: I was offered a $70K coding job out of high school. I made over 4x that amount at my first software job after graduation. They definitely weren't paying 4x more because of my degree, and most of my graduating class was making less than 100K. But it would have taken me longer than 4 years to get those skills and knowledge if I were working full time. More importantly, I would have had very few options a few years later when I got bored/burned-out with programming and decided to move into research.
College is a space to develop your human capital. The official curriculum is there as a guide and baseline, but it's your responsibility to use your course project time and especially free time in an effective way to build on top of that curriculum. Unfortunately, very few students understand this.
> Indeed, professional mathematicians are usually the ones teaching prospective high school teachers during undergrad, so we are the ones to solve it.
The vast majority of people who are paid to use their knowledge of mathematics aren't doing math research. In fact, mathematics research is so uneconomical that most mathematics researchers spend 20%+ of their job on teaching.
The number of people who are paid a full 12 (or even 9) month salary to do mathematics research is vanishingly small.
> This problem is accessible to anyone who knows basic high school algebra, but it requires a lot of effort
That effort is really difficult to get out of students because few students have intrinsic motivation, and there's very little extrinsic motivation (unless 3 post-docs is your idea of a good way to spend your 30s...)
I'm not sure what the solution is, but showing more people what mathematics researchers in universities do with their non-teaching time seems like the wrong solution.
What I described is what I consider to be "doing mathematics". As distinct from applying mathematics to other problems (also a very worthwhile endeavor!)
I'm certainly not encouraging everyone to pursue a career as a math professor; the job market is poor, and is getting bleaker.
But I feel like prospective math high school teachers should have an experience like the one I described. Much more important, in my opinion, than learning how to row reduce matrices, do integration by parts, or prove the intermediate value theorem.
> Where I live in Maryland, you can do two years at the local community college (which is excellent) for $8,000 in tuition and fees, before finishing up at a four year-college.
This is true for the majority of students, but not generally true for the top ~1-10% of students who often need all 4 years to compete for slots at top law schools, med schools, phd programs, and employers. And getting into the top 4-5 of any of those things (except maybe phd programs, outside of a few fields like CS) will pay itself back in spades.
I'd be curious to see a study that figures out who thinks they are in that 1%-10% vs. who actually is. I.e., are people just being idiots, or are they over-estimating the trajectory of their early career?
Also, even for the other 90%, the same is true for lots of stuff. Including housing and vehicles. People aren't rational.
That explains why a small elite who likely don't need assistance with loans don't avail themselves of community college, but doesn't say much about the general case.
Not sure doing the first two at community college impacts post-graduate admissions much?
If an applicant begins his or her undergraduate education at a community college, excels academically, transfers to a four-year institution, and continues an upward trend by maintaining an excellent GPA, scoring well on the MCAT, and demonstrating a proclivity toward patient care and research, their educational path can be seen as an asset.
The executive order doesn't actually do anything about debt.
It just requests reports from various agencies, which will then be used to help shape policy in 2020 when the Higher Education Act is reauthorized. Similar reports have been requested by previous administrations, to little effect.
The EO was largely about free speech at research universities, which is extremely narrow and ignores the really problematic institutions both in terms of free speech and in terms of student loan debt.
The free speech rules apply to places like UC Berkeley and UIUC (who are already obligated to follow 1A, and who the courts are adequately dealing with when they don't) but not places like Wheaton College and Cedarville University (both of whom will fire tenured faculty for not being Christian or Republican enough, and expel students for being unapologetically homosexual).
Because the oversight is targeted at research institutions, it also completely ignores for-profit institutions, where 50% of students fall into default, as well as non-research private institutions, which are often far more expensive than their public counterparts.
Both are by design -- requiring free speech and financial accountability from any recipient of federal student loan money would put lots of conservative Christian colleges in very hot water...
To pile on, predictions like this should round to the nearest nice number within the error bars. Saying "49%" implies you have enough knowledge to distinguish between "49%" and "50%" (in this case, there isn't even any quantitative analysis -- it's just a made up number, so they might as well say "49.44232115%" if they're not going to say "50%").
Yeah. Sometimes we use "50/50" as a euphemism when we really have no idea. 49% sounds like "I have no idea but really want everyone to think I've given this a lot of careful thought."
If I had to be charitable, maybe Diamond believes a lone prediction at 50% is actually meaningless, so he always nudges to one side or the other so people can at least score him a little. No idea though.
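The "round to the nearest nice number" idea can be made concrete with a small sketch (Python used only as a convenient illustration; the 5% step is a hypothetical choice of granularity, not something from the comment):

```python
def round_forecast(p, step=0.05):
    # Snap a probability estimate to the nearest "nice" step.
    # A forecaster with no quantitative model arguably can't
    # resolve probabilities finer than this granularity.
    return round(round(p / step) * step, 10)
```

Under this rule a gut-feel "49%" snaps back to 50%, which honestly signals "coin flip" instead of implying false precision.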
Some remote work companies don't even allow you to work remotely from a location outside their home country.
So "foreign national on foreign soil" is VERY different from "remote"!
Moving development offshore is often not what companies have in mind when they say "remote", and for good reasons. Taxes, travel expenses, timezones, risk profile, etc. are more complicated. Especially the risk profile. And that's before considering more nebulous stuff like communication skills and "culture fit". At the very least, it's a lot of unknowns.
So, make sure you aren't wasting energy applying for positions that were never open to international (or non EU in the case of Europe) applicants in the first place.
I can relate to this. I'm not from India but from that part of the world and have been in the same situation as the OP. Many companies that advertise themselves as 'globally distributed' are simply not willing to hire someone who does not have proper time zone overlap. Plus, the logistical challenges involved make them more reluctant.
Also, you have to keep in mind remote companies get more applicants per job posting than a non-remote one. Unless your application truly stands out it’s unlikely that you’ll get through the initial screening.
Already filtering that out. Unfortunately, most people don't seem to differentiate clearly between the two, so I generally end up writing an email asking about their geographic hiring constraints before applying for the job.
Also, when I say I get very few interview calls for the jobs I apply to, that is for companies which are looking to hire from other countries.
The article focused on confusion among students learning the language.
First, imperative programming is important to learn. Understanding the stack/heap and calling conventions is important. Yes, those things are hard to learn. But once you learn them, they are not that confusing. These are significant pedagogical concerns, but I'm not sure they are so important beyond the classroom.
Second, FP is also confusing to new programmers but for different reasons. E.g. anything that requires folds is typically easier in imperative settings.
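To make the fold point concrete, here's a minimal sketch of the same sum written both ways (Python chosen only as a neutral illustration; the thread isn't tied to any one language):

```python
from functools import reduce

# Imperative version: the running total is explicit, mutable
# state that the loop updates step by step.
def total_imperative(xs):
    acc = 0
    for x in xs:
        acc += x
    return acc

# Fold version: the accumulator is threaded through a combining
# function instead of being mutated in place.
def total_fold(xs):
    return reduce(lambda acc, x: acc + x, xs, 0)
```

Both return the same result; the fold just asks the reader to track the accumulator implicitly, which is where beginners often stumble.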
I agree that imperative programming is important to learn, but Java, in my opinion, shouldn't be the language to do so. I think learning should be fun, and Java is the exact opposite of fun: it's wordy, full of boilerplate and enterprise code. Java, in contrast to other major languages, doesn't have an inspiring narrative or a reason to be.
I used to feel that java was a good teaching language, but after using it at work, I totally agree.
Java is excellent for enterprise distributed systems. It's got excellent tooling and an absurd quantity of libraries.
It's not good for simple programs to teach students compared to say...python or even C++. Especially things that don't really NEED the complex machinery built in to the JVM/Language spec.
For imperative programming, I think it makes sense to teach C and then also at least one object-oriented garbage collected language. Doesn't have to be Java.
I also think teaching some course in Java is somewhat reasonable, if only because Java is so pervasive and so important in the history of programming that it's good for properly trained software engineers to have seen it.
Having recently taken a few courses in Java, learning things like data structures can be fun, but it depends a lot on the instructor and the exercises provided.
IMO, for someone new to programming, imperative is much simpler: do this, then do that. I also think programmers should eventually (not in the beginning) understand how things actually work: that there is a stack and a heap, that external libraries are linked or dynamically loaded, etc. Otherwise you can hit an error like "out of stack space" and have no idea what the hell a stack is or why recursion could cause that error.
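The "out of stack space" scenario is easy to demonstrate. A minimal Python sketch (Python chosen only as a convenient illustration; its RecursionError is the interpreter-level analogue of a native stack overflow):

```python
import sys

# Recursive sum: every element costs one stack frame, so a long
# enough input exhausts the call stack.
def recursive_sum(xs):
    if not xs:
        return 0
    return xs[0] + recursive_sum(xs[1:])

def hits_stack_limit():
    # Twice the interpreter's recursion limit guarantees overflow.
    deep = list(range(sys.getrecursionlimit() * 2))
    try:
        recursive_sum(deep)
    except RecursionError:
        return True  # Python's version of "out of stack space"
    return False
```

A dev who has never heard of the call stack sees only a cryptic error here; one who knows the model immediately suspects the recursion depth.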
It was six years ago. Try again, maybe? Clojure is slowly but steadily growing. The world is never "ready." It was never ready for Clojure, for OCaml, for Smalltalk, for Haskell, for PureScript, for Elm, for Erlang, for Elixir - yet teams are using them, there are conferences all around the world, libs are being written, podcasts recorded. There are jobs. I started writing ClojureScript 3 years ago, and since then I can't find a better alternative for myself - a language that is well suited to building awesome things and makes me happy. I've changed jobs 3 times since then and never settled for anything but Clojure. Never had a problem finding a job.
- Do not tie the applicant to a particular employer, and
- in addition to min salary requirements, also require independent demonstration of in-demand skills. Otherwise this system is too easy to game for large employers/sectors, who could do the equivalent of dumping by temporarily paying high salaries and then ratcheting down once they anticipate the market will clear at their desired price.
So basically, turn the H1B into an O-1 style visa where you don't need to be as extraordinary as long as you're well-compensated.
Which was exactly the intent of the H1B in the first place!
> Citizenship granted after ... 3 years?
I like the intent, but that's pretty aggressive. Permanent residency after 3 years is maybe a better way of phrasing that.
I'm cool with bringing people over to work if there is a real need. I really just want it to have an upfront cost that is tied to the need... some way of proving that need and skills would be nice.
I'd be ok with permanent residency - anything that prevents the employer from being able to threaten to kick you back home at a whim.