Quick Primer on MCP Using Ollama and LangChain (polarsparc.com)
minimaxir 6 days ago [-]
In the case of MCPs, this post is indeed a quick primer. But from a coding standpoint, and despite the marketing claim that Agent/MCP development simplifies generative LLM workflows, it's a long coding mess, and it's hard to tell whether it's even worth it. It's still the ReAct paradigm at a low level, and if you couldn't find a use case for tools then, nothing has changed other than the Agent/MCP hype making things more confusing and giving more ammunition to AI detractors.
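For reference, the MCP server half of an example like the article's is mostly just decorating plain functions as tools. A minimal sketch with the official Python MCP SDK's FastMCP (the tool and names are placeholders, not the article's actual code):

    # Minimal MCP server sketch (assumes the official `mcp` Python SDK).
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("demo-server")

    @mcp.tool()
    def calculate_interest(principal: float, rate: float, years: int) -> float:
        """Compute simple interest for a principal at a yearly rate (percent)."""
        return principal * rate * years / 100

    if __name__ == "__main__":
        # Expose the tool over stdio so an MCP client/agent can call it.
        mcp.run(transport="stdio")
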
copperroof 6 days ago [-]
Yes, I read this post and was actually emotionally affected by a post about coding. I was surprised how sad I felt. I’ve been around for a long time but this truly feels like the best era if you like gluing trash to other trash and shipping it.
gsibble 6 days ago [-]
MCP is great for when you’re integrating tools locally into IDEs and such. It’s a terrible standard for building more robust applications with multi-user support. Security and authentication are completely lacking.

99% of people wouldn’t be able to find the API keys you need to feed into most MCP servers.

sunpazed 6 days ago [-]
While I’m a fan, we’re not using MCP for any production workloads for these very reasons.

Authentication, session management, etc, should be handled outside of the standard, and outside of the LLM flow entirely.

I recently mused on these here: https://github.com/sunpazed/agent-mcp/blob/master/mcp-what-i...

bswamina 6 days ago [-]
You are correct ... it is still early days IMHO ... will have to see how this evolves
pkoird 6 days ago [-]
What, according to you, are some alternatives that exist or are in development that fill these gaps?
bongodongobob 6 days ago [-]
Is anyone really still using langchain? Has it gotten better? Seemed like a token burning platform the last time I used it.
jsemrau 6 days ago [-]
I recently finished a LangGraph class on DeepLearning.AI about a week after it came out. Even then, the provided notebook example didn't work and I had to debug it to pass. I had great hopes for LangChain in 2024, but their product decisions around LCEL, the constant breaking changes, and the lack of any discernible roadmap made me move away from them.
minimaxir 6 days ago [-]
I had to look up LCEL: https://python.langchain.com/docs/concepts/lcel/

What the heck? I have no idea what problem this is solving while not also creating new problems.
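For reference, an LCEL chain is just runnables composed with the pipe operator. A minimal sketch, assuming an Ollama model (none of this is from the article):

    # Minimal LCEL sketch: prompt -> model -> output parser, piped together.
    from langchain_core.prompts import ChatPromptTemplate
    from langchain_core.output_parsers import StrOutputParser
    from langchain_ollama import ChatOllama

    prompt = ChatPromptTemplate.from_template("Summarize in one sentence: {text}")
    llm = ChatOllama(model="llama3.1")  # assumed model name

    chain = prompt | llm | StrOutputParser()
    print(chain.invoke({"text": "MCP exposes local tools to LLM agents."}))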

jsemrau 6 days ago [-]
The worst thing about LCEL is not just that it's a different coding pattern. It also creates a major break within the documentation that can't be fixed, since you now have to check whether any documentation you find was written with or without LCEL.
pydry 6 days ago [-]
It's still absolutely fucking terrible - the mongo of the LLM world.
WD-42 6 days ago [-]
If you need to define and write the functions to calculate interest… what exactly is the LLM bringing to the table here? I feel like I'm missing something.
minimaxir 6 days ago [-]
The LLM is what decides which endpoint/tool to call (or none at all) in response to the user input.

The original 2022 ReAct paper is still the best explainer: https://arxiv.org/abs/2210.03629
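A rough sketch of what that looks like with LangChain and Ollama (the model name and tool here are assumptions, not the article's code):

    # Sketch: the model decides whether to call the tool and with what arguments.
    from langchain_core.tools import tool
    from langchain_ollama import ChatOllama

    @tool
    def calculate_interest(principal: float, rate: float, years: int) -> float:
        """Compute simple interest for a principal at a yearly rate (percent)."""
        return principal * rate * years / 100

    llm = ChatOllama(model="llama3.1")  # assumed tool-calling-capable model
    llm_with_tools = llm.bind_tools([calculate_interest])

    msg = llm_with_tools.invoke("What is the interest on 1000 at 5% over 3 years?")
    print(msg.tool_calls)  # the chosen tool + arguments, or [] if no tool was picked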

mistrial9 6 days ago [-]
maybe some of these could fit? https://python.langchain.com/docs/tutorials/
mettamage 6 days ago [-]
I think it's the case that you don't need to, but you can if you find it necessary. Basically, you're augmenting LLMs with "normal computer power", just like a human.
gatienboquet 6 days ago [-]
You know it's going to be a great article when the design is from 1995
gclawes 6 days ago [-]
This website design is blessed. A great return to the past
entrop 6 days ago [-]
I'm just distracted by the "WALLA" in the penultimate paragraph.

It should be "Voilà", which is French for “there it is”.

the_arun 6 days ago [-]
Even the name takes us back in time - SPARC
trebligdivad 6 days ago [-]
The units for the free memory are interestingly wrong. The article shows 'Executing shell command: free -m' followed by 'The total system memory is 64222 bytes, with used (available) 8912 bytes.'

which, given that there seems to be no way to specify any data structure or typing in this MCP interface, is hardly surprising!
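My guess at why (a sketch, assuming the article's tool simply shells out and hands the raw text back to the model):

    # Sketch of an untyped MCP tool that returns `free -m` output as a plain string.
    import subprocess

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("sysinfo")

    @mcp.tool()
    def get_free_memory() -> str:
        """Run `free -m` and return its raw text output."""
        # The values are mebibytes, but nothing in this plain string tells the
        # model that, so it is free to (wrongly) report them as bytes.
        return subprocess.run(["free", "-m"], capture_output=True, text=True).stdout

    if __name__ == "__main__":
        mcp.run(transport="stdio")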
