karmakaze's comments (Hacker News)

> The first 90 percent of an AI coding project comes in fast and amazes you. The last 10 percent involves tediously filling in the details through back-and-forth trial-and-error conversation with the agent. Tasks that require deeper insight or understanding than what the agent can provide still require humans to make the connections and guide it in the right direction.

So then why not, at this point, switch to the human being the primary author and have the AI only do reviews and touch-ups? Or are we restricting ourselves to vibe coding only?


> The last 10 percent involves tediously filling in the details through back-and-forth trial-and-error conversation with the agent

It so often sounds like "traditional coding" flows like an orchestra during an opera while vibe and 'agentic' coding flows like a bunch of big bands practicing.

Are they trying to tell the story that "it's the same" or that "it's just not the same"? Is the toolchain changing so much that there is no reason to learn the baseline anymore? Should the next ten years of AI development be left to those who already wield the basic tools? Just like the economy? Is the narrative meant to establish a singularity-driven relationship with young coders, computer scientists, and those who use code to entertain, inform, and sell via media? While simultaneously pushing the outliers to the edge of the sphere, locking them out, via their lack of AI skills and experience with such tools, from ever reaching a proper chunk of the mob?

On the one hand, it's a personal decision. Trends and narratives can be convincing. Defectors are rare nowadays, while polarization and the status quo are the de facto standard. So on the other hand, it's a depersonalized decision reinforcing the hierarchies (hardware) that dictate which tools (hardware, the cloud) dominate the mainstream either way.

> Or are we restricting ourselves to vibe coding only?

> why not at this point switch to the human being primary author

It's the only choice. You are either the primary author of the code or of the learning material. In the former case, the latter is implied and you can't teach if you don't know.

In essence, all this "AI hype" should really only motivate. But the perception of "the end of stuff as we know it" and "NOW it's definitely not in my/our hands anymore" is everywhere and weighs heavy, so that the only residual outcome is: making money is the only thing that's left. Again, this reinforces the hierarchies (hardware) that dictate which tools (hardware, the cloud) dominate the mainstream either way--whether you break under the weight or not, whether you shrug it off or become versed enough to just carry it along--while establishing a singularity-driven relationship of the system with its constituents.

This is the way.


Not much more info than it being a 31B model. Here's info on GLM-4.7[0] in general.

I suppose Flash is merely a distillation of that. Filed under mildly interesting for now.

[0] https://z.ai/blog/glm-4.7


How interesting it is depends purely on your use-case. For me this is the perfect size for running fine-tuning experiments.

An A3.9B MoE (3.9B parameters active per token), apparently.
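As a rough sketch of why the active-parameter count matters (assuming the 31B total mentioned upthread and 3.9B active per token, which is what "A3.9B" naming usually denotes -- both figures are assumptions here, not official specs):

```python
# Back-of-envelope only: both figures are assumptions from the thread,
# not official specs (31B total parameters, 3.9B active per token).
total_params = 31e9
active_params = 3.9e9

# Per-token inference compute scales roughly with *active* parameters,
# so the MoE costs a fraction of what a dense 31B model would.
compute_fraction = active_params / total_params
print(f"~{compute_fraction:.1%} of dense per-token compute")  # ~12.6%
```

That ratio is the usual appeal of MoE at this size: dense-31B-ish capacity at roughly one-eighth the per-token compute.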

Makes sense though: having the general population aware of the rest of the world is challenging for governing.

The news here though is that Canada is now rest of the world rather than close trading partner.


OT: ...protecting their privacy. LOL (wrong playbook triggered writing this)

> [...] to make Apple Creator Studio an exciting subscription suite to empower creators of all disciplines while protecting their privacy.


Fun read. Yet another solution is to GC only the stack, à la "Cheney on the MTA", used by CHICKEN Scheme at one time. [No relation to Dave Cheney of the article.]
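Not CHICKEN's actual implementation (which works on the C stack), but the semispace copying idea underneath -- Cheney's algorithm, which the MTA trick builds on -- can be sketched in a few lines. The `Cell` class and heap lists here are invented purely for illustration:

```python
# Minimal sketch of Cheney-style semispace copying, the algorithm the
# "Cheney on the MTA" trick builds on. All names are invented for
# illustration; CHICKEN's real collector evacuates the C stack.

class Cell:
    def __init__(self, value, children=()):
        self.value = value
        self.children = list(children)
        self.forward = None  # forwarding pointer once copied

def copy_collect(roots):
    """Copy every cell reachable from `roots` into a fresh to-space,
    using a single scan index instead of recursion (Cheney's queue)."""
    to_space = []

    def copy(cell):
        if cell.forward is None:          # not yet moved
            clone = Cell(cell.value, cell.children)
            cell.forward = clone
            to_space.append(clone)
        return cell.forward

    new_roots = [copy(r) for r in roots]
    scan = 0
    while scan < len(to_space):           # breadth-first scan of to-space
        cell = to_space[scan]
        cell.children = [copy(c) for c in cell.children]
        scan += 1
    return new_roots, to_space

# Garbage (unreachable cells) is simply never copied:
a = Cell("a"); b = Cell("b", [a]); garbage = Cell("junk")
roots, live = copy_collect([b])
print(len(live))  # 2 -- only b and a survive
```

The key trick is that the to-space list doubles as the work queue via the `scan` index, so collection needs no recursion and no mark bits; anything unreachable just never gets copied.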

Exactly. We should all speak/write Esperanto.

In the meantime, I'll advocate for evolving Markdown, specifically GFM.

I do use a custom keyboard layout, but that doesn't have to interoperate with anything/anyone.

Edit: I have a similar unpopular opinion: we should use functional languages and immutable data structures. At least we see some movement in that direction, with the patterns being adopted by other languages and codebases.


This demonstrates the difference that still exists between free software and open source. People who use the GPL/AGPL want software to be shared and spread, versus others using source as a means to an end other than more/better software. There is still a problem with AI/LLMs, though, in that they effectively strip the GPL from such licensed source.

> (Relativity isn't a problem here, though it is a tempting distraction. All of humanity's current computing systems share a close enough frame of reference to make relativistic differences in the perception of time immaterial).

GPS clocks take special and general relativity into account. I'm no expert, just something I thought I'd read.

I'd never heard of the FLP result before, and it seems a mostly theoretical concern: real distributed systems do have time bounds etc., so it doesn't strictly apply (unlike CAP, which always does).

Edit: I like the details that are presented, but not the way it's done. If organized as reference material, it could be more useful for this volume of info. As long as we're digging into the weeds, I'd like to hear about how there is no absolute/universal "same time" for spatially separated events.


Thanks for the thoughtful comment!

I think there’s a small misunderstanding, though: while systems like GPS do account for relativistic effects at the clock level (and even with extremely precise atomic clocks and practical synchronization via NTP), this doesn’t mean we have a universal or perfectly shared notion of “the same time” across distributed nodes — especially once you consider network delays, clock drift, and faulty or unreachable nodes.

In physics there is no absolute global time for spatially separated events, and in distributed systems this shows up as unavoidable uncertainty in synchronization.

Also, the FLP result isn’t about relativity or physical clocks at all — it’s a theoretical result about the impossibility of guaranteed consensus in fully asynchronous systems with failures.

So even with very accurate clocks and practical time bounds, distributed algorithms still have to explicitly deal with uncertainty and partial synchrony rather than assuming perfect global time.
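A concrete example of sidestepping wall-clock time entirely is Lamport's logical clock, where ordering comes from message causality rather than synchronized clocks. A minimal sketch (the `Node` class is invented here for illustration):

```python
# Minimal Lamport logical clock sketch: event ordering comes from
# message causality, not synchronized wall clocks. `Node` is invented
# purely for illustration.

class Node:
    def __init__(self, name):
        self.name = name
        self.clock = 0

    def local_event(self):
        self.clock += 1
        return self.clock

    def send(self):
        self.clock += 1
        return self.clock            # timestamp travels with the message

    def receive(self, timestamp):
        # take the max of local and received clocks, then tick
        self.clock = max(self.clock, timestamp) + 1
        return self.clock

a, b = Node("a"), Node("b")
a.local_event()          # a: 1
t = a.send()             # a: 2, message carries 2
b.local_event()          # b: 1
b.receive(t)             # b: max(1, 2) + 1 = 3
print(a.clock, b.clock)  # 2 3
```

The receive rule guarantees a send is always ordered before its receive, which is exactly the kind of causal ordering distributed algorithms fall back on when no perfect global time exists.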


How do they come up with the fine amount?

> The fine represents 1% of the company’s global revenue, where the law allows for a maximum of 2%.

It seems to me that they should be fining only Italy's portion of their revenue (or at most the EU's portion, if the EU backs the move).

Even better, they should have to provide data to support the amount of revenue that was lost from lack of compliance.
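With made-up figures (neither revenue number is in the article; the only given facts are the 1% fine and the 2% legal maximum), the two bases being compared look like:

```python
# All revenue figures are hypothetical, purely to illustrate the gap
# between fining global revenue vs. only the Italian portion.
global_revenue = 10_000_000_000            # assumed worldwide revenue
italy_revenue = global_revenue * 3 // 100  # assumed 3% earned in Italy

fine_as_levied = global_revenue // 100     # 1% of global revenue
fine_italy_only = italy_revenue // 100     # 1% of Italian revenue only

print(fine_as_levied)   # 100000000
print(fine_italy_only)  # 3000000
```

Under these assumed numbers the global-revenue basis yields a fine over 30x larger than an Italy-only basis, which is presumably the point of writing the law that way.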


Having both modes could be good. The default allows pre-noodling, and enabling it gives the satisfaction of being correct one-shot (or falls back to a retry with noodling).
