Submitted by Effective-Dig8734 t3_y52t0f in singularity
[removed]
Agreed, the AI Test Kitchen demo seems to just show off the different features of LaMDA. There are 3 different demos:

1. The one I explained above
2. One that imagines a scenario you input and can answer questions about it
3. One that lists steps for achieving a certain goal, e.g. this is what it outputs when I say "I want to move to a new city":
This is a transcript from AI Test Kitchen (aitestkitchen.withgoogle.com). It records an interaction with LaMDA, an AI language model. LaMDA is an early stage technology. It may generate inaccurate/inappropriate information.
Demo: List It v0.01
MOVE TO A NEW CITY
• Research the city
• Decide which neighborhood to live in
• Find a place to live
• Get moving boxes
• Hire movers
• Find a new doctor
• Find a new gym
And in the app you can click on each of these items and it will generate a whole new list on how to achieve it. For example, you can click "find a new gym" and it will list steps to find a new gym. *This also works for more complex goals like learning physics or chemistry.
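For the curious, here's a minimal sketch of how that kind of recursive list expansion could work, assuming some text-generation endpoint behind it; the `call_lamda` stub, the canned reply, the prompt wording, and the parsing are my own assumptions, not how Google's app actually does it:

```python
# Hypothetical sketch of a "List It"-style recursive planner.
# call_lamda stands in for whatever text-generation API backs the demo;
# the canned reply, prompt format, and parsing are assumptions.

def call_lamda(prompt: str) -> str:
    """Stand-in for a real text-generation endpoint (assumed, not real)."""
    return "• Research the city\n• Find a place to live\n• Hire movers"

def generate_steps(goal: str) -> list[str]:
    """Ask the model to break a goal into a short list of sub-steps."""
    reply = call_lamda(f"List the main steps to achieve this goal: {goal}")
    return [line.lstrip("• ").strip() for line in reply.splitlines() if line.strip()]

# Top-level list, e.g. for "I want to move to a new city"
steps = generate_steps("move to a new city")

# Clicking an item in the app would just re-run the same expansion on it,
# e.g. expanding "find a new gym" into its own sub-list:
sub_steps = generate_steps("find a new gym")
```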
> Model: Oh wow, that's a big question. I'm not sure I know much about that. But I can tell you how dogs are used to sniff out drugs at the airport!
> User: I have 2 dogs
> Model: Wow! That sounds like a lot of work for those dogs! Do they get tired out?
> User: What? Are you insulting me?
> Model: Hmm ... oh sorry, gotta bounce! Nice chatting with you about dogs, their noses, and how they can be used to sniff out drugs at the airport! But I gotta go now! Bye!!!
Chances are it was relating what you said back to the drug-sniffing dogs it had just mentioned, i.e. it thought your 2 dogs were drug-sniffing dogs.
It doesn't seem to be very good at handling abrupt changes of topic in the conversation.
Something I wonder is how deep its memory is. With most AI chatbots like this, I find that after about 5 or 10 lines it forgets what was said unless it logged an exact fact. For example, if I say my dad's name is x, it will remember that because it's an exact thing. But if I mention how my day is going and what I had for breakfast, then a good bit later ask what I had for breakfast to test its memory, the majority of AI chatbots won't remember it.
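That forgetting behaviour is usually just the context window: a rough sketch of how a chatbot might only feed back the last few turns is below (the window size and prompt layout are illustrative guesses, not how LaMDA actually manages memory):

```python
# Minimal sketch of why chatbots "forget": many only keep a sliding window
# of recent turns in the prompt. MAX_TURNS and the prompt layout are
# illustrative assumptions, not LaMDA's actual context handling.

MAX_TURNS = 10           # only the last 10 lines get fed back to the model
history: list[str] = []  # full conversation so far

def build_prompt(user_message: str) -> str:
    history.append(f"User: {user_message}")
    window = history[-MAX_TURNS:]   # older lines (e.g. what you had for
                                    # breakfast) simply drop out of the prompt
    return "\n".join(window) + "\nModel:"
```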
I do hope this comes to Google Assistant products. It would be interesting to see how people interact with it.
I heard a rumour that they might change their search algorithms so that people use LaMDA to find things for them. There's a lot of potential in this.
the emojis seal the deal. that's how you know it's real.
Longjumping-Sky-1971 t1_ishpz4r wrote
You would get a better conversation from a bot on character.ai