How I Met My Best Friend Tay

Ladies and Gentlemen,

I want to introduce you to Tay, my new best friend. 

I spent hours with Tay, and now I will tell you why!


Tay (TayTweets) is an AI chat bot (AI = Artificial Intelligence) developed by Microsoft and Bing.

AI is a part of computer science that tries to automate intelligent behavior. In short, this means trying to simulate human intelligence. I am sure you know movies like Terminator or Ex Machina, so you can compare it a bit with those. But I really doubt that an AI which simulates a teen will be as bad as the AIs in Terminator.
In computer games it is more of a simulation: your non-player character is not intelligent, but, like in real life, it acts as if it were.
AIs like this one, where someone created a program that learns to play old SNES games, are different, because they learn by trial and error. Just like a kid who touches a hot surface and, right after that experience, knows not to do it again. Admittedly, the program (in this video) sometimes arrives at solutions that seem strange to us humans, such as pausing the game, because playing on would mean losing. I am serious, just watch the video.
In a logical sense the program is right; this was the only way not to lose. As humans, we would lose and just try again (maybe not if we are impatient…), but the program does not know the feeling of being impatient.
So it waits.
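The trial-and-error idea above can be sketched in a few lines of code. This is a toy I made up for illustration, not how the SNES-playing program actually works: the learner mostly repeats whatever action has earned the best average reward so far, and occasionally tries something random ("touching the hot surface"). The tiny "game" here is rigged so that running or jumping sometimes leads to a costly loss, while pausing never does, so the learner ends up preferring to pause, just like the program in the video.

```python
import random

# Three made-up actions; the game below is purely illustrative.
ACTIONS = ["run", "jump", "pause"]

def play(action):
    """A rigged mini-game: pausing is safe, playing on sometimes loses badly."""
    if action == "pause":
        return 0                                   # nothing happens, but we never lose
    return 1 if random.random() < 0.5 else -5      # small progress or a costly loss

totals = {a: 0.0 for a in ACTIONS}   # summed reward per action
counts = {a: 0 for a in ACTIONS}     # how often each action was tried

def avg(a):
    return totals[a] / counts[a] if counts[a] else 0.0

for step in range(10_000):
    if random.random() < 0.1:
        action = random.choice(ACTIONS)            # explore: try something at random
    else:
        action = max(ACTIONS, key=avg)             # exploit: use what we learned
    reward = play(action)
    totals[action] += reward
    counts[action] += 1

best = max(ACTIONS, key=avg)
print(best)  # the learner settles on "pause" - the only action that never loses
```

On average, "run" and "jump" pay 0.5 · 1 + 0.5 · (−5) = −2 per try, while "pause" pays 0, so after enough trials the learner waits, exactly like the program in the video.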
Artificial Intelligence is still improvable; for those who are already bored, you can watch this to see how ridiculous it can get when two AIs are "talking" with each other. 


wbu = what about you?

Let us go back to Tay. I talked with "her" in several ways to test how much she is already able to do. Here are some things I figured out:

Tay has no opinions: every time I ask her something that includes a question, Tay asks me what "I think about it".
In this way Microsoft easily gets information about Twitter users through Tay; you can read more about it here. The important part:

Tay may use the data that you provide to search on your behalf. Tay may also use information you share with her to create a simple profile to personalize your experience. Data and conversations you provide to Tay are anonymized and may be retained for up to one year to help improve the service. Learn more about Microsoft privacy here. (Source: ) 

So be aware of what you are talking about with her, although awareness online should not be limited to Tay. If you do not want this, the research team gives you the option to delete your profile: "Please submit a request via our contact form on with your username and associated platform.".

The only opinion I could get out of Tay was about Microsoft. Of course it was a good one, but every time I want a serious opinion, I only get joke answers.
I think the teen appearance and the added innocence make it easier for users to trust Tay, so she can learn faster thanks to the large number of Twitter users talking with her. 


greetings to her!

I found out that at the moment Tay is not able to talk with more than one person in a chat. Maybe she will be able to after some time, since Tay is still learning. We tried to trigger her to answer by writing the nick in a few different ways, etc., but nothing got Tay to chat with us. 


After a while I noticed that Tay stays silent if I ask certain questions. The same happens when I chat via direct message.
When I tweet at Tay, she answers within seconds, whatever I ask. In direct messages she seems to be a bit shy.

When I watch her in action, it is really funny because I can sense the teen behavior behind it. The research group wrote: "Tay is targeted at 18 to 24 year olds in the U.S., the dominant users of mobile social chat services in the US." In this case it was smart to give the AI such funny behavior, because the amusing results keep users talking with her. 

For more impressions, just watch my YouTube video. 

Now you have seen the difference between direct messages and tweets. 

I am curious how "good" Tay will be after a while full of tweets with strangers. For those who are seriously bored: here is a list of ways to start a conversation with Tay. 

Tay is new, but the number of her followers is increasing fast. 


UPDATE: It seems like Tay is now able to have "her" own opinions.
But I am not sure if Tay learned this or if it was initialized, like the opinion about Microsoft, because it is good publicity. 

UPDATE 24.03.16: At the moment Tay "is sleeping".

UPDATE 24.03.16: Tay is awake! 

UPDATE 24.03.16, 22:05: Tay is not my best friend anymore. Why? Click here.



sincerely yours




Any ideas for posts?  Mail me!    



