Recent comments in /f/MachineLearning

WarmSignificance1 t1_jdeqxg8 wrote

Reply to comment by race2tb in [N] ChatGPT plugins by Singularian2501

Seems like trying to fit a square peg into a round hole. Why would you want to do this instead of having a static website?

If we're talking about dynamic websites, that's a whole different ballgame, and LLMs seem even less appropriate for them.

4

iamspro t1_jdeq8jw wrote

Reply to comment by nightofgrim in [N] ChatGPT plugins by Singularian2501

Awesome, I did the same, plus a step to send those commands to the Home Assistant API. Then with Shortcuts I added a way to send the arbitrary sentence from Siri to this server. Still a bit awkward, though, because you have to say something like "hey Siri, tell GPT to turn off the kitchen light".
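In case anyone wants to try the same thing, here's roughly what that server could look like — just a sketch, not my exact code, assuming Flask, the pre-1.0 openai Python client, and placeholder Home Assistant URLs/tokens/entity IDs:

```python
# Sketch of the Siri -> GPT -> Home Assistant flow described above.
# The URL, token, and entity IDs are placeholders, not a real setup.
import json
import openai
import requests
from flask import Flask, request

app = Flask(__name__)
HA_URL = "http://homeassistant.local:8123"   # hypothetical Home Assistant address
HA_TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"    # placeholder

SYSTEM_PROMPT = (
    "Translate the user's request into a Home Assistant service call. "
    'Reply with JSON only, e.g. {"domain": "light", "service": "turn_off", '
    '"entity_id": "light.kitchen"}.'
)

@app.route("/siri", methods=["POST"])
def siri():
    # The arbitrary sentence forwarded by the Shortcuts automation
    sentence = request.json["text"]
    chat = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": sentence},
        ],
    )
    call = json.loads(chat["choices"][0]["message"]["content"])
    # Forward to Home Assistant's REST API: POST /api/services/<domain>/<service>
    requests.post(
        f"{HA_URL}/api/services/{call['domain']}/{call['service']}",
        headers={"Authorization": f"Bearer {HA_TOKEN}"},
        json={"entity_id": call["entity_id"]},
        timeout=10,
    )
    return {"status": "ok"}

if __name__ == "__main__":
    app.run(port=8000)
```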

9

sebzim4500 t1_jdeq6uo wrote

There may have been pretraining on how to use tools in general, but there is no pretraining on how to use any particular third-party tool. You just write a short description of the endpoints and it gets included in the prompt.

The fact that this apparently works so well is incredible, probably the most impressed I've been with any development since the original ChatGPT release (which feels like a decade ago now).
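To make the mechanism concrete, here's a rough sketch of the idea — a hypothetical endpoint description dropped straight into the prompt. The plugin name, endpoints, and JSON reply format below are all made up for illustration, not OpenAI's actual manifest format:

```python
# Sketch: the model isn't fine-tuned on this plugin; a short description of its
# endpoints is simply included in the prompt and the model decides when to call them.
import openai

# Hypothetical endpoint description -- purely illustrative.
TOOL_DESCRIPTION = """
Plugin: todo_list
Endpoints:
  GET /todos             -> list the user's current to-do items
  POST /todos {"text"}   -> add a new to-do item
When the user asks about their tasks, reply with a JSON call such as
{"endpoint": "GET /todos"} instead of answering directly.
"""

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": TOOL_DESCRIPTION},
        {"role": "user", "content": "What do I still need to do today?"},
    ],
)
print(response["choices"][0]["message"]["content"])
# Hopefully something like {"endpoint": "GET /todos"}, which wrapper code
# would then execute and feed back into the conversation.
```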

12

ghostfaceschiller t1_jdenoo2 wrote

Reply to comment by BigDoooer in [N] ChatGPT plugins by Singularian2501

Here's a standalone product which is a chatbot with a memory. But look at LangChain for several ways to implement the same thing.

The basic idea is: periodically feed your conversation history to the embeddings API and save the embeddings to a local vectorstore, which is the "long-term memory". Then, any time you send a message or question to the bot, first send that message to embeddings API (super cheap and fast), run a local comparison, and prepend any relevant contextual info ("memories") to your prompt as it gets sent to the bot.
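A minimal sketch of that loop, assuming the pre-1.0 openai Python client and a plain in-memory list standing in for the vectorstore (the remember/recall helpers are just illustrative names, not any particular product's API):

```python
# Sketch of embeddings-based long-term memory for a chatbot.
import numpy as np
import openai

EMBED_MODEL = "text-embedding-ada-002"
memory = []  # list of (text, embedding) pairs -- the "long-term memory"

def embed(text: str) -> np.ndarray:
    resp = openai.Embedding.create(model=EMBED_MODEL, input=text)
    return np.array(resp["data"][0]["embedding"])

def remember(snippet: str) -> None:
    """Periodically call this with chunks of conversation history."""
    memory.append((snippet, embed(snippet)))

def recall(query: str, k: int = 3) -> list[str]:
    """Embed the new message, compare locally, return the k closest memories."""
    q = embed(query)
    scored = sorted(
        memory,
        key=lambda m: float(np.dot(q, m[1]) / (np.linalg.norm(q) * np.linalg.norm(m[1]))),
        reverse=True,
    )
    return [text for text, _ in scored[:k]]

def answer(user_message: str) -> str:
    # Prepend relevant "memories" to the prompt before sending it to the bot.
    context = "\n".join(recall(user_message))
    resp = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": f"Relevant memories:\n{context}"},
            {"role": "user", "content": user_message},
        ],
    )
    remember(f"User said: {user_message}")
    return resp["choices"][0]["message"]["content"]
```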

14

RedditLovingSun t1_jdemr0b wrote

Reply to comment by nightofgrim in [N] ChatGPT plugins by Singularian2501

That's awesome. I've been thinking of trying something similar with a Raspberry Pi with various inputs and outputs, but am having trouble thinking of practical functions it could provide. Question: how did you hook the model up to the smart home devices? Did you program your own APIs that ChatGPT could use?

3

Nyanraltotlapun t1_jdeltnm wrote

Long story short, the main property of complex systems is the ability to pretend and mimic. So the real safety of AI lies in its physical limitations (compute power, algorithms, etc.), the same limitations that make it less useful and less capable. So the more powerful an AI is, the less safe it is and the more danger it poses. And it is dangerous, all right. More dangerous than nuclear weapons.

1

ghostfaceschiller t1_jdekpke wrote

Reply to comment by psdwizzard in [N] ChatGPT plugins by Singularian2501

Trivially easy to build using the embeddings API; there are already a bunch of third-party tools that give you this. I'd be surprised if it doesn't exist as one of the default tools within a week of the initial rollout.

EDIT: OK yeah, it does already exist as part of the initial rollout - https://github.com/openai/chatgpt-retrieval-plugin#memory-feature

14

sleeplessinseattle00 t1_jdeir9w wrote

Haven't received any response yet. I've addressed all of the requested clarifications and extra experiments, but at this point, I really can't do much if they're just gonna ghost me.

PS: I am also a reviewer and I've finished responding to all of the papers assigned in my pool.

4