The fact that it allows them to put the pipes needed for all those items in the same place probably also helps. In a lot of places near me, the kitchen sink shares a wall with the bathroom because it's cheaper to do the plumbing that way. If the bathroom and kitchen were on opposite sides of the place, that would mean more plumbing work.
Plus, walls take up a lot of space. If we put the toilet in a different room, that would mean losing more space to walls (and losing configuration options). In my bathroom, the sink is in the front, followed by the toilet, followed by the shower. Walls (and the clearance that the doors would need) would mandate using a lot more space. Plus, you'd need more space for each bit not to feel claustrophobic. You don't feel like you're in a tiny box at the sink in my bathroom because it's open to the toilet and shower. If each section were only 3 ft wide, it would feel really claustrophobic.
Jesus fucking christ. He fucking ran MD5 on some shit he pulled down from a web crawler.
Over-training a model on the validation set would be a lot more "brilliant", and even that is a dumb script-kiddie-level hack. Maybe finding an algorithm that computes weights s.t. the preimage of the training algorithm on the training set matches the result of training with truly random weights using the validation set. That could be a "clever hack". And even then, _brilliant_ would be... a real fucking stretch.
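For context, the "hack" being mocked here is just an exact-match lookup: hash everything the crawler pulls down and check each digest against precomputed hashes of the benchmark's test items. A minimal sketch of that idea (all data and names below are made up for illustration):

```python
import hashlib

def md5_hex(data: bytes) -> str:
    """MD5 digest of raw bytes as a hex string."""
    return hashlib.md5(data).hexdigest()

# Hypothetical: digests of the benchmark's test documents, built offline.
TEST_SET_HASHES = {md5_hex(b"test document A"), md5_hex(b"test document B")}

def crawl_hits(crawled: list[bytes]) -> list[bytes]:
    """Return crawled documents whose MD5 exactly matches a test-set item."""
    return [doc for doc in crawled if md5_hex(doc) in TEST_SET_HASHES]

hits = crawl_hits([b"test document A", b"some unrelated page"])
```

Note that this only catches byte-for-byte duplicates; any trivial reformatting of a test document defeats it, which is part of why it's a stretch to call it clever.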
It's a pretty low bar for any dog that doesn't have major behavioral issues; I'd expect any dog with a non-dumbass owner and a good home from birth to be able to pass. Even many hard-core rescues could probably pass after a few years in a stable home.
A variant of this test for airline travel would probably not be too hard to design.
One of the problems with making any particular test mandatory is that the test operator can price gouge and then genuine service dogs will become even more expensive. So, such a rule would have to be paired with price controls on the test.
Without very careful contest design, the best performers are obviously going to be over-fitting. Especially if the entire distribution is public. That's exactly what this team did.
This is true of academic contests in general, btw, even without cheating. They stop being interesting/fun/good signals as soon as people start treating them as an independent skill set. Comparing two new players' performance over their first N chess games might be a good signal for some general intellectual capabilities. Comparing experienced players against one another is mostly just testing who's spent more time learning about chess.
In the past I've had a lot of issues with spammers and scammers using domain names that came from Namecheap. Namecheap made it difficult to report and resolve issues; spammers/scammers could create names much faster than they could be taken down. Namecheap was not interested in tying one violation to another: they did not want to see that a few players were generating a lot of names that appeared to be from different people but all immediately started running the same spam/scam. If Namecheap chooses to be a citadel for spammers/scammers, then I'm sure not going to give them any of my money.
No, I don't expect perfection. However, I do expect very careful implementation of access management for very large databases containing lots of PII and other sensitive customer information. Things like huge databases being accessible without credentials shouldn't require perfection on the part of some human. That sort of stuff should be continuously audited in an automated fashion.
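As a sketch of what "continuously audited in an automated fashion" could mean at its simplest: a scheduled job walks an inventory of internal data-store endpoints and flags any that hand back data to a request carrying no credentials. The endpoint, the inventory source, and the "2xx without auth means exposed" heuristic are all my assumptions here, not anyone's real process:

```python
import urllib.error
import urllib.request

def looks_unauthenticated(status: int) -> bool:
    # Heuristic (assumption): a 2xx answer to a credential-less request
    # means the endpoint serves data to anyone who can reach it.
    return 200 <= status < 300

def probe(url: str, timeout: float = 5.0) -> bool:
    """Return True if `url` serves data with no credentials attached."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return looks_unauthenticated(resp.status)
    except urllib.error.HTTPError as exc:
        return looks_unauthenticated(exc.code)  # 401/403: auth is enforced
    except OSError:
        return False  # unreachable from the audit host; nothing to flag

# Hypothetical inventory; a real audit would pull this from asset tracking
# and run on a schedule, e.g.:
#   exposed = [url for url in ENDPOINTS if probe(url)]
ENDPOINTS = ["http://10.0.0.5:9200/_cat/indices"]
```

Even something this crude, run continuously against a complete asset inventory, catches the "huge database open to the internet" class of mistake without requiring any individual human to be perfect.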
But the software industry is quite bad, as a whole, so even the relatively competent actors make surprising, high-impact mistakes.
Maybe it's because the stakes are relatively low (cf. a bridge collapsing vs. a PII leak) and the competition relatively fierce? Maybe software engineering is still very young and moving quickly?
In any case, I think it's totally reasonable to hold the opinion that MSFT is doing things pretty well relative to the rest of the industry and also that the industry as a whole is doing a pretty poor job.
IDK, for me the story has to be one of the following:
1. MSFT made a huge and inexcusable mistake, so maybe there's something systemically wrong with MSFT; or,
2. MSFT is very competent, and even very competent people are making very big mistakes, so maybe there's something systemically wrong with the entire industry.
Architect here: from the outside looking in, you hit the nail on the head. In addition to the industry being so young, the _relatively_ low impact when bad things happen makes things like this 'not a big deal'. When your mistakes result in a public outcry for a day and then fade into obscurity overnight, why change? Why invest money into figuring out a better way?
When your mistake makes a building fall over...well, there's a reason why that almost never happens.
I don't think this is quite right. Most buildings don't get all their design parameters tested in reality. But when there is an earthquake and a building collapses, you find that various checks and balances in the design process went wrong. I know here in NZ, where we have had a number of significant earthquakes, all kinds of known and unknown things have been discovered about buildings: either ones that ended up killing people, or ones which are now condemned because things played out differently than the designers thought they would.
Speaking about America: almost everything in a building beyond aesthetics is designed to a CODE MINIMUM, from the hangers that hang the ACT ceiling all the way to, and especially, the structural system. These systems have been designed and tested ad nauseam to provide minimum life-safety standards. People in any industry can cut corners and screw up, and special situations can arise that surpass a minimum-level standard (fires started at every exit door, a 9.0 earthquake... good luck). The forest you're missing through the trees here is the structured process that forces designers in a mature industry to design to a minimum agreed-upon standard. Ironically, I'm highlighting the benefits of regulation... where it makes sense.
The forest I might be missing through the trees is that maybe there is an agreed-upon standard within the tech industry. My understanding is that almost all of these breaches happen because of comically silly mistakes (pw = password), not super-sophisticated attacks.
Same with NZ, which has pretty strict codes, as we are sitting at the junction of 3 tectonic plates. Regulation, including inspection, is great and generally works great, but until you get an earthquake you really don't know if all the checks and ticking of boxes actually did their job. Microsoft and others likely catch multiple problems through checks, but occasionally a perfect storm happens and things break down. You then adjust your "regulations" to cover any shortcomings (hopefully). The entire planet you are missing through the forest is that buildings aren't constantly "penetration" tested to find where they have problems. A quick search shows that the USA suffers from many live, deployed buildings that have been shown not to meet compliance. By engineers who should've known better...
I can respect the sentiment, and I agree your points are reasonable. And I think the problem actually stems from software development culture more than anything. Developers don't want the level of oversight you're suggesting. Many would make career decisions in order to avoid that kind of babysitting.
At the same time, the tech world is bigger than it used to be, the stakes are higher, and more is on the line than ever before. Mistakes are more costly (though in this particular case I don't think you could prove any real damages).
And worst of all, the political world remains incredibly tech-illiterate. So, those in charge of guiding us in this realm are ill-equipped to do so.
I don't have a good answer for this. In an ideal world I'd like businesses to take this sort of thing more seriously, but in reality I don't see any reason that they should.
Wow, I didn't realize the standards stuff is also done by volunteers.
> The only people paid at conferences are the registration people.
...if by paid you mean "free labor from local organizers and their students, plus some $1K-$3K travel awards for a PhD student in exchange for 20 hours of front desk registration, with maybe one association employee overseeing everything".
TBF, the travel awards are actually a pretty reasonable rate (avg $2K for 1/2 week of labor) if you close your eyes and ignore that:
2. CS PhD students are underpaid by a factor of 10
2. CS PhD students are probably the only tech workers who are expected to raise funds on their own for work-related travel multiple times per year
3. these organizations double-dip. They are essentially paying a bit over market rate for the labor required to run their conference registration desks while claiming that this relatively small amount of $$$ gives them some moral justification for price-gouging everything else (see: any ACM/IEEE statement about open access policy, which inevitably mentions sponsoring student travel...)
4. ...and then still lean on conference organizers to find corporate sponsors to supplement those travel awards.
5. and those same students do a ton of other free labor in the form of writing and reviewing papers.
And so on.
Basically, any time I think "oh yeah, ACM/IEEE do that thing I didn't think about. Well, those people must be paid / must be receiving honest-to-goodness grants", it turns out: nope! All the work is done by volunteers, and any "grants" come with labor requirements.
TBF, none of that is totally unreasonable until you realize how much money these orgs are raking in. Where the hell does it all go!?
This is an excellent teaching resource. I really like these sorts of well-written, short-but-not-too-short deep dives on accessible topics. It's a great way to help students learn how to read math and build up confidence in their ability to learn on their own.
It's key to include false starts and promising paths that dead-end. Learning to explore unknown solution spaces and to refine the domain of the problem itself are critical skills in any open-ended problem.
Spending an entire school lifetime being taught to "solve" "problems" and then being confronted with a world where problems aren't defined and solutions are ad-hoc and piecemeal is a rude awakening.
The strategies that made us good students and made us feel good and smart in school aren't the same strategies that make for a good employee, and those strategies set new devs up to fail when they can't "see the answer" to the current Jira ticket they're tasked with.
It clearly did me in. It made me very sad that I could not write code for problems I had not seen before. I still haven't learned the mental skills necessary to face open problems with a curious mind. My first reaction to new problems is fear and anxiety, to the extent that I had to leave development entirely. I now do production support, which is not what I intended to do. I wanted to be a great developer.