Regulators are highlighting the risks of AI-driven cryptocurrency trading systems that make enticing but deceptive promises of high or guaranteed profits. The rise of automated trading software has prompted the Commodity Futures Trading Commission (CFTC) to issue a consumer advisory emphasizing that AI cannot predict market movements with certainty.
The advisory, titled “AI Won’t Turn Trading Bots into Money Machines,” details the deceptive tactics used to lure investors. Among the cases it cites is that of Cornelius Johannes Steynberg, who defrauded investors of more than $1.7 billion in Bitcoin (BTC).
The CFTC warns traders that the alluring promises of substantial returns from AI-enhanced tools are often empty. Melanie Devoe of the CFTC’s Office of Customer Education and Outreach advises treating such AI claims with caution, warning that fraudsters exploit the hype around the technology.
Despite these warnings, some leading exchanges like Bitget are advancing AI bot technology. Bitget CEO Gracy Chen noted that their AI systems continually evolve by analyzing historical strategy data.
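Chen’s description hints at the kind of backtesting loop such systems rely on. The Python sketch below is purely illustrative, not Bitget’s actual implementation: it scores a set of hypothetical moving-average crossover strategies against historical (here, synthetic) price data and keeps the best performer, a toy version of “evolving by analyzing historical strategy data.” All function names and parameters are assumptions for the example.

```python
# Illustrative sketch only: a naive "evolve by backtesting" loop.
# All names, parameters, and the synthetic price series are hypothetical.
import random

def synthetic_prices(n=500, seed=42):
    """Generate a toy random-walk price series standing in for historical data."""
    random.seed(seed)
    prices, price = [], 100.0
    for _ in range(n):
        price *= 1 + random.gauss(0, 0.01)  # ~1% daily volatility
        prices.append(price)
    return prices

def sma(prices, window):
    """Simple moving average; None until enough history has accumulated."""
    return [None if i + 1 < window else sum(prices[i + 1 - window:i + 1]) / window
            for i in range(len(prices))]

def backtest_crossover(prices, fast, slow):
    """Total return of a long-only SMA crossover rule on the given history."""
    fast_ma, slow_ma = sma(prices, fast), sma(prices, slow)
    ret, holding = 1.0, False
    for i in range(1, len(prices)):
        if fast_ma[i] is None or slow_ma[i] is None:
            continue
        if holding:
            ret *= prices[i] / prices[i - 1]   # accrue daily return while long
        holding = fast_ma[i] > slow_ma[i]      # go long when fast MA is above slow
    return ret - 1.0

prices = synthetic_prices()
# "Evolve" by scoring candidate parameterizations on historical data
candidates = [(f, s) for f in (5, 10, 20) for s in (50, 100)]
best = max(candidates, key=lambda c: backtest_crossover(prices, *c))
print(f"Best candidate on this history: fast={best[0]}, slow={best[1]}, "
      f"return={backtest_crossover(prices, *best):.2%}")
```

The obvious pitfall, and part of what the CFTC’s advisory warns about, is that a strategy selected this way is fitted to past data; a strong historical score carries no guarantee about future prices.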
Furthermore, the CFTC and its Office of Technology Innovation have initiated a Request for Comment (RFC) to better understand AI’s role and risks in derivatives markets. The CFTC seeks wide-ranging feedback to explore AI’s impact on various trading aspects, including risk management, market surveillance, cybersecurity, analytics, and customer service.
CFTC Chair Rostin Behnam stresses the need to keep regulatory oversight in step with technological progress, with customer protection as the priority in evolving markets. The RFC is central to the Commission’s effort to develop a data-driven approach to regulation and supervision.
The agency also points out AI’s potential in regulatory compliance, particularly in market surveillance, anti-money laundering (AML), and reporting. Investors and market participants are invited to share their insights by April 24, 2024, as the CFTC considers new regulations or guidelines that could shape AI’s future in both mainstream and crypto trading.