For me (actually trying to get shit done using this stuff) it's validation.
Being able to have a verifiable input/output structure is key. I suppose you can do that with a regular HTTP API call (JSON), but where do you document the OpenAPI/schema stuff? Oh yeah... something like MCP.
I agree that MCP isn't as refined as it should be, but when used properly it's better than letting the model burn through tokens scraping around web content.
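To make the "verifiable input/output structure" point concrete, here's a rough sketch of what a tool declaration buys you: the tool advertises a JSON Schema for its arguments, so the host can reject a malformed call before spending any tokens. The tool name, fields, and the toy validator below are all made up for illustration; a real MCP SDK or a JSON Schema library would do the validation for you.

```python
# Illustrative tool declaration with a JSON-Schema-style input contract.
# All names here are hypothetical, not from any specific SDK.
TOOL = {
    "name": "get_invoice",
    "description": "Fetch an invoice by id.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "invoice_id": {"type": "string"},
            "include_line_items": {"type": "boolean"},
        },
        "required": ["invoice_id"],
    },
}

def validate_args(schema, args):
    """Minimal JSON-Schema-ish check: required keys present, primitive types match."""
    type_map = {"string": str, "boolean": bool, "number": (int, float)}
    errors = []
    for key in schema.get("required", []):
        if key not in args:
            errors.append(f"missing required field: {key}")
    for key, val in args.items():
        prop = schema.get("properties", {}).get(key)
        if prop and not isinstance(val, type_map.get(prop["type"], object)):
            errors.append(f"{key}: expected {prop['type']}")
    return errors

print(validate_args(TOOL["inputSchema"], {"invoice_id": "inv-42"}))      # → []
print(validate_args(TOOL["inputSchema"], {"include_line_items": True}))  # → missing field error
```

The point isn't this particular validator; it's that the schema lives next to the tool and is machine-checkable, instead of in some OpenAPI doc the model never sees.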
Yup, routing is key. Just like how we've had RAG so we don't have to add every biz doc to the context.
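The routing idea can be sketched the same way RAG works: retrieve only the relevant subset instead of stuffing everything into context. This toy router scores tools by keyword overlap with the query; a real system would use embeddings, and every name below is invented for the example.

```python
# Hypothetical tool catalog: tool name -> bag of descriptive keywords.
TOOLS = {
    "search_invoices": "billing invoice payment finance",
    "create_ticket":   "support ticket issue bug",
    "query_logs":      "logs errors tracing observability",
}

def route(query, tools, top_k=1):
    """Return the top_k tool names whose keyword bags overlap the query most."""
    words = set(query.lower().split())
    scored = sorted(
        tools,
        key=lambda name: len(words & set(tools[name].split())),
        reverse=True,
    )
    return scored[:top_k]

print(route("why did this payment invoice fail", TOOLS))  # → ['search_invoices']
```

Same trade-off as RAG: you pay a small retrieval step up front so the model only ever sees the two or three tools (or docs) that matter.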
I agree with the general idea that models are better trained on popular CLI tools like directory navigation, but outside of ls, ps, etc. the difference isn't really there; new CLIs are just as confusing to the model as new MCP servers.
People were in fact holding it wrong to get the signal to attenuate. The way you had to grip the phone to affect signal was not practical in any way. That controversy was entirely bullshit and only Apple would have ever been dragged for it.
I have a macbook and a Samsung. The products I buy are here to do a job for me and that's it. If another brand does a better job then I'll switch to that with no qualms.
Except that most READMEs are seemingly written more for end-users than for developers; and even CONTRIBUTING files often mostly just document the social contribution process + guidelines rather than providing any guidance targeted toward those who would contribute. There’s a lot of “top-level architectural assumptions” detail in particular that is left on the floor, documented nowhere. Which “works” when you expect human devs to “stare really hard and ask questions” until they figure out what’s being done differently in this codebase; but doesn’t work at all when an LLM with zero permanent learning capability gets involved.