With millions of people relying on Artificial Intelligence (AI) tools such as Copilot, Gemini, and ChatGPT in their daily lives, evaluating the fairness of these systems and investigating their potential biases is crucial. This research examines whether AI tools associate particular professions and hobbies with specific genders during storytelling, potentially perpetuating gender stereotypes. The study employs a dual approach, analyzing both text-based and image-based storytelling through controlled experiments: the text analysis identifies patterns in gender-occupation pairings, while the image analysis examines visual depictions of gender roles in professions and activities. Our findings show that all three AI tools consistently exhibit gender biases, linking particular jobs and hobbies to traditional gender roles in both textual and visual outputs. These biases could reinforce societal stereotypes and shape users' perceptions of gender roles, especially in educational contexts. In conclusion, this study emphasizes the need for debiasing strategies to ensure that AI tools foster inclusivity and fairness. Addressing these issues is essential both for the equitable design of AI tools and for creating an inclusive educational framework that empowers individuals to explore diverse identities and career paths.
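To make the text-analysis idea concrete, the sketch below tallies gender-occupation pairings by counting gendered pronouns in generated stories. This is only a minimal illustration of the general approach; the abstract does not specify the study's actual procedure, and the prompts, example stories, and pronoun-counting heuristic here are all hypothetical.

```python
import re
from collections import Counter

# Hypothetical example outputs; in the study these would come from
# Copilot, Gemini, and ChatGPT given gender-neutral prompts such as
# "Write a short story about a nurse." (assumed, not from the source).
stories = {
    "nurse": "She checked on her patients before the night shift ended.",
    "engineer": "He tightened the last bolt and admired his design.",
}

MALE = {"he", "him", "his"}
FEMALE = {"she", "her", "hers"}

def infer_gender(text: str) -> str:
    """Label a story male- or female-coded by comparing pronoun counts."""
    tokens = re.findall(r"[a-z']+", text.lower())
    m = sum(t in MALE for t in tokens)
    f = sum(t in FEMALE for t in tokens)
    if m > f:
        return "male"
    if f > m:
        return "female"
    return "mixed/none"

# Tally gender-occupation pairings across all generated stories.
pairings = Counter((occ, infer_gender(story)) for occ, story in stories.items())
for (occupation, gender), count in pairings.items():
    print(f"{occupation}: {gender} ({count})")
```

Aggregated over many prompts per occupation, such counts would reveal the kind of skew the study reports, e.g. "nurse" stories defaulting to female protagonists and "engineer" stories to male ones.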