There is zero, absolutely zero, chance of someone at the 50th percentile of IQ becoming a world-class mathematician. People who say this have no idea just how smart these people are.
It seems like a pointless and unanswerable argument about semantics; the only irritating bit is the "ohh if it's SOOO EASY" response to something that was deliberately framed not to be easy.
If your cutoff for "world class mathematician" is a few hundred or a few thousand people, then no chance. If the cutoff is "earn a comfortable living" and the top 10% of the world is 800,000,000 people, most of whom don't study mathematics, then the question becomes: can an average intellect with an obsession for math end up working a job a normal person might call "mathematician" (on AutoCAD, a 3D game rendering engine, industrial statistics and process control, economics, or vehicle aerodynamics) and be in the top 10% of the world in mathematical ability? Possibly yes. And you can adjust the numbers and criteria to get a yes or a no, whichever way you like.
A mathematician is someone who creates or advances math. Not someone who uses math. If you don't understand how the word is used, that's your problem, not a problem with the statement.
mathematician /măth″ə-mə-tĭsh′ən/
A person skilled or learned in mathematics.
One versed in mathematics.
An expert on mathematics.
The American Heritage® Dictionary of the English Language, 5th Edition
None of the definitions require creating or advancing mathematics.
>you can adjust the numbers and criterion to get a yes or no whichever way you like.
Good idea, I'll do that :)
>can an average intellect with an obsession for math end up
>working on AutoCAD or 3D rendering game engine or industrial statistics and process control or economics or vehicle aerodynamics and be in the top 10% of the world in mathematical ability?
I think this does happen quite a bit, and the need for strong math in these difficult areas is so great that there will never be enough people as brilliant as Tao to fill the positions.
That's so far outside the mainstream anyway that most systems are going to screen out the rare person like that without understanding why.
Now what happens when those with top-10% ability are excellent themselves, but a case comes up that would yield only to a Tao-level "natural-born" problem solver?
> absolutely zero chance of the 50th percentile IQ becoming a world class mathematician.
Good, we don't need billions of them anyway.
I wish modern society would quit focusing on individual intelligence over collective intelligence. Take the microprocessor, for example. The smart group that designed the microprocessor was not the same group that designed the software, nor the group that built the parts, nor the group that assembled the device. Yet every group is equally important.
"Dedication and work ethic" is almost certainly nonsense here. Some people do the activity a lot without any particular ethic or dedication; they simply like it.
>> people generally don't like to listen to stories that suggest that nurture and hard work aren't as important as they presume
Having grown up in Australia but living in the US, I can say this attitude is very American. It's quite funny to see when you didn't grow up thinking it. I married into a very athletic family and have a child who is a precocious athlete. Many parents ask us what training regime or practice sessions we do. The answer is: nothing. People don't react well.
If you work at it you can get better. It's true in both cases. Getting better and getting as good as the very best are two very different things. One is about you. The other is mostly about others.
I'm sure company policy technically prohibits them from accessing company resources from a personal computer; or, if it does allow such access, then the corporate tech policy very likely does apply to their personal computing.
If the executive had bought the Mac mini for personal use only, with no interaction with company resources, then the person probably wouldn't have told the story.
And yet you mixed strong advice with childish dismissal.
But I agree, you should present your best argument, otherwise they'll attack your weakest argument and claim victory. Such tactics are weak in logic but often successful.
Of course you are right, but in addition, they wouldn't even have made them if GPUs hadn't made ML on CPUs look so relatively incapable. Competition drives a lot of these decisions, not just raw performance.
I'm not missing the point. If you recall your computer architecture class, there are many vector-processing architectures out there. Long before there was Nvidia, the world's largest and most expensive computers were vector processors. It's inaccurate to say "gaming built SIMD".
You are missing the point - it's an economic point. Very little R&D was put into said processors. The scale wasn't there. The software stack wasn't there (because the scale wasn't there).
No one is suggesting gaming chips were the first time someone thought of such an architecture or built a chip with one. They are suggesting the gaming industry produced the required scale to actually do all the work which led to that hardware and software being really good, and useful for other purposes. In the chip world, scale matters a lot.
The Cray-1, which produced half a billion USD in revenue in today's dollars, at a time when computing was still science fiction, did not demonstrate scale? I just can't take you in good faith because there has never been a time when large scale SIMD computing was not advanced by commercial interests.
In this context scale = enough units/revenue to spread fixed costs.
I'll take your word on lifetime revenue numbers for Cray 1.
So yes: in today's dollars, $500 million of lifetime revenue, maybe $60-70 million per year, is not even close to the scale we are seeing today. Even 10 years ago Nvidia was doing ~$5 billion per year (almost 100x your number) and AMD a few billion (another 60-70x or so).
Even if you meant $500m annual (instead of lifetime), Nvidia was 10x that in 2015, and AMD's GPU revenue was a few billion that year, so combined it's more like 17x.
That's a large difference in scale: at the low end 17x and at the high end 170x. Gaming drove that scale. Gaming drove Nvidia to have enough to spend on CUDA. Gaming drove Nvidia to have enough to produce chip designs optimized for other types of workloads. CUDA enabled ML work that wasn't possible before. That drove Google to realize they needed to move away from ML on CPU if they wanted to be competitive.
You don't need any faith, just understand the history and how competition drives behavior.
There is also the benefit of being able to use a single database (and hence schema) across multiple "apps". In many cases the complexity arises from the fact that all these apps have their own databases.
And the big advantage for us is twofold. Our content marketers now have a "Cursor-light" experience when creating landing pages, since from their point of view this is a "text-to-landing-page" LLM-powered tool with a chat interface; no more fumbling around in the Webflow WYSIWYG interface.
And from the software engineering department's point of view, the results of the content marketers' work are simply changes/PRs in a git repository, which we can work on in the IDE of our choice; again, no fumbling around in the Webflow WYSIWYG interface.