Strategic Plan
Our roadmap for building Scheller into a destination for applied AI in business: where the Center is heading, how we plan to get there, and how we will know it is working.
Mission and Vision
Mission
Vision
Scheller is the institution business leaders turn to for applied AI: graduates who deliver AI value from day one, faculty research that shapes how companies adopt and govern AI, and a partner network that defines the standard for AI in business.
Unique Value Proposition
The Center for AI in Business combines Georgia Tech's engineering leadership with Scheller's business expertise to provide what no pure engineering school or standalone business school can: applied AI training grounded in both technical rigor and real-world business judgment.
Strategic Goals
Become the hub for applied AI projects and practicums
Serve as the central gateway for AI-related practicums and experiential projects across Scheller: match student and faculty expertise with partner needs, and ensure consistent project scoping, delivery quality, and measurable impact.
What this looks like:
- A structured intake process for partner opportunities
- Clear project scoping, governance, and success metrics
- Repeatable practicum models that scale across programs and partners
- High-quality deliverables that partners can act on and students can showcase
Build and sustain Scheller's applied AI capability portfolio
Develop a cohesive suite of applied AI capabilities and learning assets that demonstrate Scheller's expertise and enable students to build real systems, not just concepts.
What this looks like:
- A curated set of AI applications, methods, and use-case playbooks
- Applied toolkits for evaluation, deployment readiness, and monitoring
- Case-based modules that translate current AI practice into the classroom
- Demonstrations that reflect real business constraints (data, controls, ROI, governance)
Lead responsible AI and agent-ready operating models for business
Equip students and industry leaders to manage AI responsibly, especially as agentic systems become integrated into business processes.
What this looks like:
- Practical guidance on AI governance, controls, and risk management
- Methods for designing human-in-the-loop workflows
- Evaluation practices for reliability, safety, bias, and performance drift
- Operating model frameworks for AI adoption at scale (people, process, technology)
Strengthen industry engagement and fundraising through credible demonstrations
Showcase applied AI work in ways that resonate with industry partners and donors, positioning Scheller as a trusted source of insight and impact.
What this looks like:
- A public-facing showcase of AI applications, case studies, and outcomes
- Events that connect partners, students, and faculty around live demonstrations
- A narrative portfolio that ties AI capability to workforce outcomes and societal relevance
- Partnership pathways that evolve from projects to sustained collaboration and support
Key Initiatives
Practicum and Experiential Hub
- Launch a consistent project intake, scoping, and delivery model
- Establish a partner project pipeline with defined tiers and expectations
- Create reusable templates for project governance, data access, evaluation, and final reporting
Applied AI Capability and Learning Assets
- Define an applied AI competency map, including agentic AI literacy
- Develop core learning assets (cases, exercises, toolkits) aligned to competencies
- Build a small set of flagship demonstrations that can be expanded over time
Assurance of Learning
- Implement standardized rubrics for applied AI outcomes
- Enable student portfolios of applied artifacts (deliverables, documentation, evaluation)
- Produce annual reporting that summarizes outcomes and improvement actions
Partner and Donor Engagement
- Publish a web-based showcase of Center work and applied demonstrations
- Host recurring events for partners and alumni (showcase, speaker series, demo day)
- Create sponsorship and philanthropic pathways tied to signature programs
Measures of Success
- Growth: high-quality practicums and partner renewals
- Participation: student participation and placement outcomes tied to AI roles and capabilities
- Adoption: uptake of Center learning assets across programs and courses
- Outcomes: documented assurance-of-learning results aligned to AI readiness goals
- Visibility: engagement with the Center's applied AI showcase
- Funding: sustained funding growth through partnerships and philanthropy