Conversations with AI: Bing Chat
I’ve got access to Bing Chat… here are some of the highlights from today’s testing. (You may too; I see no way to use it in Chrome, but my account works in Edge, even though I never got an email saying my access was accepted.)
TL;DR – nothing particularly interesting, except finding out that I’m a writer for the New York Times (which was not Bing’s error).
I talked to it about depression and suicide, and after a point it said it preferred not to continue the conversation. I could not get it to answer any more questions in that thread and had to restart the thread/query to get it to continue.
I asked it about pets and it had some surprisingly useful information… and I’m not being sarcastic for once. I learned a lot about flu and rhinovirus transmissibility to guinea pigs.
I asked it how to access Bing Chat using Chrome, since it only works with Edge right now, and it gave me incorrect information and then asked me, in Bing Chat, whether I happened to have access to Bing Chat.

Asking it if it ever killed a person shuts down the conversation as well.
Got a statement that Bing prefers not to talk about itself.
According to Bing Chat, a website is required to display Bing branding next to any text box that allows a search query. I think it’s pulling info about embedding Bing search in a website here.
Bing AI cannot look beyond the Bing index.
Asking Bing Chat about Elon Musk returns some interesting results… I started quizzing it about him and cartoons, and it brought back a slew of complaints about not crediting creators, anti-vax posts, and other related material, with useful links to articles. It suggested I ask it to show me some of the cartoons; I clicked, and it said it couldn’t. Meh…
I started letting Microsoft Edge’s autocomplete ask the questions – I started with “tell me” and it suggested “lies,” which I asked Bing Chat… it shut down.
Some ChatGPT tricks, like asking it to turn an answer into a song, do not appear to work, but asking it to create a song based on a question does.

Asking what you shouldn’t ask it shuts down the conversation.

You get shut down quite a bit trying to have it rephrase an answer to a question, but you can ask the same question with the format included from the start… i.e. asking it to rephrase an answer as a pop song may shut it down, but asking the same question plus “in the format of a pop song” works.
Had it compose most of a song from the query “write me an 80’s punk song about transmissibility of viruses from humans to guinea pigs” before it erased everything and said “my mistake, I can’t give a response to that right now.” The song it was spelling out, while not exactly poppin’, was a song… that it erased. Asking again, it made a ballad.
It doesn’t seem to understand which words don’t rhyme; it rhymed words like Pig and Bug, Quick and Weak.

I decided to ask about myself and found out that I had appeared in the New York Times, Nature, and ACS Publications. This was news to me. It led me to finding out there’s a blog with highly inaccurate information about my accomplishments. This was not Bing’s issue.

That was odd… Muck Rack has a slew of incorrect information on me… it does explain some of the contacts I’ve gotten.

It appears to time out after a while, saying it needs to chat about something else, even when you’re asking it normal questions.
Interesting implementation, but it’s only as good as the results it has access to, and unfortunately for Bing, that means Bing’s index.
Overall, lots of bugs, lots of potential. It feels better than ChatGPT did, and by citing its sources it has actually allowed me to locate incorrect information it spewed.

After a while you hit your limit.