Get Started

Install expo-ai-kit and run your first on-device AI query in minutes.

Prerequisites

Before you begin, make sure you have:

  • An Expo project using SDK 52 or later
  • For iOS testing: A physical device with Apple Intelligence support
  • Xcode 16 or later (for iOS development)

Expo SDK Requirement

expo-ai-kit requires Expo SDK 52 or later. If you're using an older SDK version, you'll need to upgrade your project first.

Installation

Install expo-ai-kit using your preferred package manager:

Terminal
npx expo install expo-ai-kit

Or with npm/yarn:

Terminal
npm install expo-ai-kit
# or
yarn add expo-ai-kit

iOS Setup

For iOS, you need to configure your app to use the Foundation Models framework. Add the following to your app.json or app.config.js:

app.json
{
  "expo": {
    "plugins": [
      [
        "expo-ai-kit",
        {
          "enableFoundationModels": true
        }
      ]
    ]
  }
}
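
If your project uses app.config.ts (or app.config.js) instead of app.json, the equivalent plugin entry looks like the sketch below; the name and slug values are placeholders for your own project settings.

app.config.ts
import { ExpoConfig } from 'expo/config';

const config: ExpoConfig = {
  name: 'my-app', // placeholder
  slug: 'my-app', // placeholder
  plugins: [
    [
      'expo-ai-kit',
      {
        enableFoundationModels: true,
      },
    ],
  ],
};

export default config;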

After configuring, rebuild your app:

Terminal
npx expo prebuild --clean
npx expo run:ios

Simulator Limitations

On-device AI features are not available in the iOS Simulator. You must test on a physical device that supports Apple Intelligence.
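
Because the Simulator can't run these features, you may want to detect that case explicitly during development and show a clearer message. The sketch below assumes the optional expo-device package is installed and combines its physical-device check with the availability check described in the next section; the file name, function name, and messages are illustrative.

utils/deviceCheck.ts
import * as Device from 'expo-device';
import { isAvailable } from 'expo-ai-kit';

// Returns a reason string when on-device AI can't be used, or null when it can.
export async function getAIUnavailableReason(): Promise<string | null> {
  if (!Device.isDevice) {
    return 'Running in a simulator or emulator; on-device AI needs physical hardware.';
  }

  if (!(await isAvailable())) {
    return 'This device or OS version does not support on-device AI.';
  }

  return null;
}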

Basic Usage

Checking Availability

Always check if on-device AI is available before attempting to use it. Availability depends on the device hardware and OS version.

utils/ai.ts
import { isAvailable } from 'expo-ai-kit';

export async function checkAISupport() {
  const available = await isAvailable();

  if (!available) {
    // Handle gracefully - show fallback UI or use cloud AI
    console.log('On-device AI not available');
    return false;
  }

  return true;
}
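
As a usage sketch, you might run this check once when a screen mounts and render fallback UI when on-device AI isn't supported. The component below is hypothetical and kept deliberately minimal:

AssistantScreen.tsx
import { useEffect, useState } from 'react';
import { Text } from 'react-native';
import { checkAISupport } from './utils/ai';

function AssistantScreen() {
  // null = still checking; true/false = result of the availability check
  const [supported, setSupported] = useState<boolean | null>(null);

  useEffect(() => {
    checkAISupport().then(setSupported);
  }, []);

  if (supported === null) {
    return <Text>Checking device capabilities…</Text>;
  }

  if (!supported) {
    return <Text>On-device AI is not available on this device.</Text>;
  }

  return <Text>Ready to chat!</Text>;
}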

Sending Messages

Once you've confirmed availability, you can create a session and send messages:

App.tsx
import { createSession, sendMessage } from 'expo-ai-kit';

async function getAIResponse(userMessage: string) {
  // Create a new session
  const session = await createSession();

  // Send the message and wait for a response
  const response = await sendMessage(session, {
    message: userMessage,
  });

  return response.text;
}

// Usage (e.g. inside an async function or event handler)
const answer = await getAIResponse('Explain quantum computing in simple terms');
console.log(answer);
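
Putting the pieces together, a guarded call might look like the sketch below, which reuses checkAISupport from the previous example; the fallback branch is a placeholder for whatever your app does when on-device AI is unavailable.

App.tsx
import { checkAISupport } from './utils/ai';

async function answerWithBestEffort(userMessage: string): Promise<string> {
  // Only create a session and query the model when the device supports it.
  if (await checkAISupport()) {
    return getAIResponse(userMessage);
  }

  // Placeholder fallback: call a cloud API of your choice, or show a message.
  return 'On-device AI is not available on this device.';
}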

Streaming Responses

For a better user experience, you can stream responses token-by-token:

ChatScreen.tsx
import { createSession, sendMessage } from 'expo-ai-kit';
import { useState } from 'react';

// Derive the session type from createSession so this compiles without
// relying on a separately exported Session type.
type Session = Awaited<ReturnType<typeof createSession>>;

function useChatAI() {
  const [response, setResponse] = useState('');
  const [isLoading, setIsLoading] = useState(false);

  async function sendChatMessage(message: string, session: Session) {
    setIsLoading(true);
    setResponse('');

    try {
      await sendMessage(session, {
        message,
        onToken: (token) => {
          // Update response as each token arrives
          setResponse((prev) => prev + token);
        },
      });
    } finally {
      setIsLoading(false);
    }
  }

  return { response, isLoading, sendChatMessage };
}
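
To wire the hook into a screen, you could do something like the following sketch; the component, button, and prompt are hypothetical, and a single session is created when the screen mounts and reused for every message.

ChatScreen.tsx
import { useEffect, useRef } from 'react';
import { Button, Text, View } from 'react-native';

// Uses the createSession import and the Session type from the hook snippet
// above, which lives in the same file.
function ChatScreen() {
  const { response, isLoading, sendChatMessage } = useChatAI();
  const sessionRef = useRef<Session | null>(null);

  useEffect(() => {
    // Create one session when the screen mounts.
    createSession().then((session) => {
      sessionRef.current = session;
    });
  }, []);

  async function handleAsk() {
    if (!sessionRef.current) return;
    await sendChatMessage('Write a haiku about the ocean', sessionRef.current);
  }

  return (
    <View>
      <Button
        title={isLoading ? 'Thinking…' : 'Ask'}
        onPress={handleAsk}
        disabled={isLoading}
      />
      <Text>{response}</Text>
    </View>
  );
}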

You're Ready!

You now have expo-ai-kit installed and configured. The library handles all the complexity of interfacing with platform-native AI frameworks.

Next Steps

Now that you have the basics working, explore more features: