@manlycoffee I'm writing a Claude Code replacement that will be able to do deep research, among many other powerful new things. I'll share it once it's out.

@yura15cbx I'm looking now and I don't see any emails from you at the address I gave you. What email did you send it from?

@yura15cbx

Did you send out that email? I didn't see it. We could probably fix this for you in a few minutes, depending on the root cause.

@yura15cbx

Do me a favor: email our system admin with the problem, and include screenshots. michael.griffith@cleverthis.com, and CC me at jeffrey.freeman@cleverthis.com.

I will get him to address this with you and help out.

@yura15cbx

Yeah, we should be able to get you set up. Did you create the destination account yet, or is that the confirmation email you're talking about?

I'm very proud that the Netherlands is boycotting the Eurovision Song Contest next year due to Israel's participation. 🇳🇱🔥 #TheNetherlands #eurovision2026

dutchnews.nl/2025/12/dutch-pul

@manlycoffee Ahh, OK, that makes more sense. Yes, EAE can be helpful in figuring out which agent should do what, among other things. Fine-tuning can improve EAE in specialized cases (it's not strictly needed in general), and that can lead to situations where MCP/skills are better utilized (by ensuring the proper agent is using the proper skill).

Though to be clear, fine-tuning is neither needed nor routinely used for this purpose. Not to say you can't, or that it won't help; it's just that a powerful generalized model is usually good enough for most things in that regard.

Personally, I use MCP quite extensively in anything I do with LLMs, and I can't think of a case where I had to fine-tune an LLM that used skills in order to improve its skill use.

@manlycoffee

I'm confused about what you mean here. I've been working in AI for 20 years; not saying you're wrong, just trying to understand what you're saying...

MCP/skills are just mechanisms the LLM has to interact with the world. NER is the ability to classify a span of text as a kind of entity: tagging "Jeff Freeman" as a person in "Jeff Freeman is a scientist," for example.

While NER is certainly very important in NLP, and yes, we can fine-tune encoder-only LLMs to be better at NER, I'm still not grasping why you're lumping fine-tuning + NER in with MCP. MCP is largely unrelated to NER and fine-tuning; am I missing some context or something?
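
To make the NER part concrete, here's a minimal sketch using spaCy's small pretrained English model (assuming en_core_web_sm is installed; the sentence is just an illustration):

import spacy

# Load a small pretrained English pipeline
# (pip install spacy, then: python -m spacy download en_core_web_sm)
nlp = spacy.load("en_core_web_sm")

doc = nlp("Jeff Freeman is a scientist at CleverThis.")

# Each recognized entity span gets a type label such as PERSON or ORG
for ent in doc.ents:
    print(ent.text, ent.label_)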

@manlycoffee

Hmm, interesting take. Maybe I'm disconnected from the common perspective, but I wonder why people associate MCP with fine-tuning; they aren't even related.

@manlycoffee Well, yeah, there is something special about it: it gives LLMs a standard way to interact with the world. That's proven to be pretty powerful.
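
For what "a standard way to interact with the world" looks like in practice, here's a minimal sketch using the official MCP Python SDK's FastMCP helper (the server name and the weather tool are made-up examples):

from mcp.server.fastmcp import FastMCP

# Any MCP-capable client can discover and call this tool over the
# same protocol, with no model-specific glue code.
mcp = FastMCP("demo-server")

@mcp.tool()
def lookup_weather(city: str) -> str:
    """Return a (stubbed) weather report for a city."""
    return f"It is sunny in {city}."  # placeholder for a real API call

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default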

@frayoshi I think the new server we're working on has that support. I will check.

@fisunov

We are starting the codebase over, but we will be keeping the content and posts (hopefully). I have our hired team working on it.

@SecondJon

@mapto Thanks, I agree; the features could always, in theory, be added back to Glitch, though many will be covered by Glitch anyway, as it already has some of our features.

@tripu We have a way to copy your data to the new server; from there you could export it if you wish, should we go with the Glitch solution (which is maintained and fairly mainstream).

@mapto Yeah, it was a bug in the web interface. We are discussing two options: fix the current bugs, though there would remain a lag in updates, or switch to Glitch and get more regular updates but lose a few minor unique features. Thoughts?

@tripu OK, I spoke with the team. I'm going to talk to the whole group, but we need to decide whether we want to keep maintaining the current fork or move over to Glitch and keep updates more regular. Thoughts?
