AIs being able to convincingly pretend to know things isn't a sign of intelligence. Come back when we have an AI that can convincingly pretend to be unaware of things that are common knowledge for the sake of a bit.
(I'm 100% not joking. A system that a. knows what it knows; b. is able to make educated guesses about what I know; c. is able to use this information to identify what I think I know it knows; and d. proceeds to pretend not to know something I think I know it knows for the express purpose of fucking with me would require a relatively sophisticated theory of mind, and I'm prepared to grant that anything that could do all that is probably engaging in something like thought.)