Hacker News | diabllicseagull's comments

I gave it a try. It's super rough around the edges, and I noticed much higher CPU usage compared to Firefox. Nevertheless, it's super promising.

My bar for super-rough is Servo, which doesn't have password autofill… and doesn't render the Orion page right.

Orion is less rough, but the color scheme doesn't work, and it doesn't have an omnibar (as in: type in the address bar, enter, and it shows search results).


If you're a little bit familiar with graphics you go: duh, things appear smaller with increasing distance. If you're not, though, it's a great intro to perspective projection. I love how accessibly educational his videos are.


I always found it odd that perspective had to be "discovered" by artists, but a little digging online turned up this interesting, detailed look at its history.

https://www.essentialvermeer.com/technique/perspective/histo...


It's a lot less about being discovered, or invented, and a lot more about the idea of using it at all. The Renaissance was a massive change in culture. Before that, art was a tool used in rituals or storytelling rather than something to be enjoyed on its own. There was more emphasis on reproducing things as they actually were than on how they looked from a particular vantage point.


Artists are still struggling with the fact that human perception arises from binocular vision. Two distinct retinal inputs are integrated by distributed neural processes into a single, coherent 3D experience. This integration is neither a simple planar stitching nor a direct representation of the world, but an active construction shaped by neural computation and subjective awareness.

It is quite likely that artists in earlier periods struggled with this as well, and were less concerned than we are with adhering strictly to a photographic or geometrically exact perspective. The adoption of the camera obscura probably influenced things a lot.


Even ignoring binocular vision, it's very unintuitive to "draw what you see" because of this. Our brain usually interprets our environment as objects, 3D shapes, and things. Turning that off and trying to capture a literal image from it is difficult.


Is “neural computation” a thing, or a poetic metaphor?



When I was a little kid trying to do 3D graphics on my Spectrum, I couldn't find any books with the algorithm for how it worked. I remember my artistic friend and me sitting down with reams of graph paper trying to figure out how to do it. It's so simple and obvious once you learn it, but until then I felt like a caveman.
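The algorithm in question really is that simple once you see it: divide by depth. A minimal sketch (the focal distance `d` and the sample points are my own illustrative choices, not anything from the comment):

```python
# Minimal perspective projection: the "divide by depth" trick.
# d is an assumed distance from the eye to the image plane; points
# farther away (larger z) land closer to the center of the screen,
# which is exactly why distant things look smaller.
def project(x: float, y: float, z: float, d: float = 1.0) -> tuple:
    """Map a 3D point to 2D image-plane coordinates."""
    return (d * x / z, d * y / z)

# Two points at the same height, one twice as far away:
near = project(2.0, 2.0, 4.0)   # (0.5, 0.5)
far = project(2.0, 2.0, 8.0)    # (0.25, 0.25) -- half the size on screen
```

Doubling the depth halves the projected size, which matches the "duh, things appear smaller with distance" intuition from the thread above.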


It is indeed a shame. If you are doing anything remotely novel, which is essential if you want to make a difference in an increasingly competitive field, LLMs confidently leave you with non-working solutions, or worse, set you on the wrong path.

I had similar worries in the past about indexable forums being replaced by Discord servers. The current situation is even worse.


Yeah. MS must have been so hurt about losing to the iPhone that they really jumped the gun on AI, as if to avoid a similar mistake. It's Satya's major play, and I think they are already paying for that decision. Xbox is hollowed out so that AI can be funded, while the PC/console hybrid project is doomed to fail because "Windows everywhere" doesn't work if Windows is crap. Indeed, they might be left with just the cloud business in the end.


And the funniest thing is: not having a mobile platform anymore will be the death knell for all of their AI efforts.

I'm not really into these AI shenanigans, but it seems to me that if you want people to use /your/ bot, you have to give it to people in the most seamless and efficient way possible, and that does not translate well to a desktop OS.

I don't think they would have dethroned iOS or even Android had they stood their ground, but they probably would've had a stronger base to build upon for their Copilot nonsense. Those who used Windows Phone used it because they loved it; Copilot could've garnered some good rep from those already sold on Microsoft's platform. Instead, they're trying to shove it down people's throats, even though very few people actually use Windows because they actively like it; most use it because it's the "default" OS and they do not (and care not to) know any better.


First they jumped the gun on tablets, listening to the tech media that was saying tablets were going to replace computers.

That resulted in Windows 8.

More recently they've freaked out about ads, app stores, and SaaS revenue, which has resulted in lots of dark patterns in the OS.


Besides the ads and privacy concerns, it's been such a delight not having to deal with unwanted updates, hunt down phantom processes eating CPU time, or wait on a File Explorer that takes forever to show ten files in the Downloads folder. I couldn't be paid to use Windows at this point.


I've been using a full AMD build running Arch for years now and have never had graphics-related issues after an update. My biggest gripe is with the HDMI Forum and how we can't have proper support in open-source drivers.


A 2x16 DDR4 kit I bought in 2020 for $160 is now $220. Older memory is relatively cheap, but not cheaper than before at all.


I wondered how much of this is inflation -- after adjusting for CPI inflation, $160 in 2020 is worth $200 in today's dollars [$], so the price of that DDR4 kit is about 10% higher in real terms.

[$] https://www.usinflationcalculator.com/
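The real-terms arithmetic in that comment works out like this (the ~1.25 CPI factor is the one implied by the linked calculator's $160 → $200 figure, not an independently verified number):

```python
# Compare the 2020 price, restated in today's dollars, with today's price.
price_2020 = 160.0
price_now = 220.0
cpi_factor = 200.0 / 160.0                 # ~1.25 cumulative inflation (per the calculator)
real_2020_price = price_2020 * cpi_factor  # $200 in today's dollars
real_increase = price_now / real_2020_price - 1
print(f"{real_increase:.0%}")              # 10% higher in real terms
```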


USD is also weaker than it was in the past.


Yes. On the Moore's Law Is Dead podcast they were talking about rumors that representatives of some "AI enterprise company" were trying to buy memory in bulk from brick-and-mortar stores; in some cases OpenAI was mentioned. Crazy if true. Also interesting considering none of that memory would be ECC, like what you would opt for in a commercial server.


I'd like to believe their pricing for RAM upgrades is like that so the base model can hit a low enough price. I don't believe they have the same margin on the base model as on the base model plus memory upgrade.


You don't want to be bandwidth-bound, sure. But it all depends on how much compute power you have to begin with. 153 GB/s is probably not enough bandwidth for an RTX 5090, but for an entry laptop/tablet chip like the M5? It's likely plenty.
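A back-of-envelope roofline check illustrates the point: a chip is bandwidth-bound when its workload's arithmetic intensity (FLOPs per byte moved) falls below peak compute divided by bandwidth, so a small chip needs far less bandwidth to stay fed. The figures below are hypothetical round numbers for illustration, not actual specs of either chip:

```python
# Roofline balance point: FLOPs per byte a workload must sustain
# before the chip stops being bandwidth-bound.
def balance_point(peak_tflops: float, bandwidth_gbs: float) -> float:
    """FLOPs per byte needed to keep the compute units fed."""
    return peak_tflops * 1e12 / (bandwidth_gbs * 1e9)

# Hypothetical figures: a big desktop GPU vs. a small laptop SoC.
print(balance_point(100.0, 1800.0))  # big GPU: ~56 FLOPs/byte needed
print(balance_point(4.0, 153.0))     # small SoC: ~26 FLOPs/byte needed
```

With far less raw compute to feed, the smaller chip reaches its balance point at a much lower arithmetic intensity, which is why 153 GB/s can be "likely plenty" there while starving a big GPU.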

