Since a chatbot called ChatGPT was released late last year and took the world by storm, the central question has been: have we truly entered the age of Artificial Intelligence (AI)? The technology has caused excitement and controversy in equal measure because it is one of the first models that can convincingly converse with its users in English and other languages on a wide range of topics. With its eerily human-like language and coding skills, ChatGPT marks a truly remarkable step, notwithstanding its inherent moral biases and other rough edges.

This technology has far-reaching consequences for science and society. Researchers and others have already used ChatGPT and other large language models to write essays and talks, summarise literature, draft and improve papers, identify research gaps, and write computer code, including statistical analyses. Soon this technology may evolve to the point that it can design experiments, write and complete manuscripts, conduct peer reviews, and support editorial decisions to accept or reject manuscripts. Experts believe that conversational AI is likely to revolutionise research practices and publishing, creating both opportunities and concerns.

So, how good is it? Are we witnessing the rise of a new AI? Has OpenAI caught some of the technology giants napping? The brand wars between technology giants are being reprised. In late 2022, the Microsoft-backed AI chatbot ChatGPT was unveiled. It became the fastest online platform to reach 100 million monthly users, doing so in just two months; for comparison, TikTok took nine months to reach that figure. From poetry to history lessons to coding, it seems able to do almost anything. This time it is Microsoft versus Google and, as many have pointed out, the foot soldiers are AI chatbots driven by the search engines themselves.
Recently, Google unveiled Bard, its own AI chatbot powered by its Language Model for Dialogue Applications (LaMDA), to "trusted testers", with plans to release it to the general public shortly afterwards. While we wait for the battle between these tech behemoths to unfold, let's not forget that Artificial Intelligence has both benefits and drawbacks.

AI can automate repetitive tasks, freeing up time and increasing overall efficiency. It can make more accurate predictions and decisions than humans, reducing the margin of error. AI-powered chatbots can provide instant, 24/7 customer service, improving the customer experience. AI can also assist doctors and researchers in the development of new treatments and cures. These are some of the positives, a collection of information that can be sourced easily.

But this technology comes with its share of drawbacks. The automation of certain tasks through AI may result in job losses, particularly in industries such as manufacturing and customer service. AI algorithms can reflect the biases of their creators, leading to discriminatory outcomes. AI systems can be vulnerable to hacking, malware, and other security threats. AI often involves the collection and analysis of large amounts of personal data, raising privacy concerns. It can be difficult to determine who is responsible for mistakes made by an AI system. And the biggest fear of them all: it could kill the art of writing.

On the one hand, AI has the potential to automate certain writing tasks, such as generating news articles, summaries, and reports, making them more efficient and accessible. On the other hand, some people are concerned that AI-generated content will replace human writers and reduce the demand for their skills. Moreover, AI-generated content may lack the nuance, creativity, and human touch that are inherent in writing produced by human authors.
While AI can no doubt complement human writing and make certain tasks easier, it can never fully replace human writers and the unique perspective they bring to the craft. As with other advances in technology, AI has the potential to bring significant improvements to many fields, but it is vital to consider the potential drawbacks and take steps to mitigate them.