Adds new vectors to the encrypted index or updates existing vectors if they have the same ID. This operation is the primary method for populating an index with vector data.
async upsert(items: VectorItem[]): Promise<SuccessResponseModel>

Parameters

| Parameter | Type | Description |
|-----------|------|-------------|
| items | VectorItem[] | Array of vector items to insert or update in the index |

VectorItem Structure

Each VectorItem must contain an id and may optionally include vector, contents, and metadata:
| Field | Type | Required | Description |
|-------|------|----------|-------------|
| id | string | Yes | Unique identifier for the vector item |
| vector | number[] | Optional | The vector representation (required if no embedding model is configured) |
| contents | string \| Buffer | Optional | Original text or binary content associated with the vector |
| metadata | object | Optional | Additional structured data associated with the vector |
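As noted in the table above, vector may be omitted when the index is configured with an embedding model. The sketch below assumes such a configuration, in which case the configured model presumably derives the vector from contents:

import { VectorItem } from 'cyborgdb';

// Sketch: assumes the index was created with an embedding model,
// so only contents (plus optional metadata) needs to be supplied.
const textOnlyItem: VectorItem = {
    id: 'doc_text_only',
    contents: 'Raw text for the configured embedding model to vectorize',
    metadata: { source: 'notes' }
};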

Returns

Promise<SuccessResponseModel>: A Promise that resolves to a success response object containing the operation status and a message that includes the count of upserted vectors.
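The exact SuccessResponseModel definition is not reproduced here; the interface below is only a sketch of the shape implied by the description above (an operation status plus a message containing the upserted count), not the SDK's actual type:

// Rough sketch of the response shape described above - illustrative only,
// not the SDK's actual SuccessResponseModel definition.
interface SuccessResponseSketch {
    status: string;   // operation status
    message: string;  // e.g. "Upserted 2 vectors"
}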

Exceptions

Example Usage

Basic Vector Upsert

import { Client, IndexIVFModel, VectorItem } from 'cyborgdb';

const client = new Client('http://localhost:8000', 'your-api-key');

// Generate a 32-byte index encryption key
const indexKey = crypto.getRandomValues(new Uint8Array(32));

// Configure and create the index
const config: IndexIVFModel = {
    type: 'ivf',
    dimension: 768,
    nLists: 1024,
    metric: 'cosine'
};

const index = await client.createIndex('my-vectors', indexKey, config);

// Prepare vector items
const vectorItems: VectorItem[] = [
    {
        id: 'doc1',
        vector: [0.1, 0.2, 0.3, /* ... 768 dimensions */],
        contents: 'This is the content of the first document',
        metadata: {
            title: 'Introduction to Machine Learning',
            author: 'Dr. Smith',
            category: 'education',
            tags: ['ml', 'ai', 'tutorial'],
            published_date: '2024-01-15'
        }
    },
    {
        id: 'doc2',
        vector: [0.4, 0.5, 0.6, /* ... 768 dimensions */],
        contents: 'This is the content of the second document',
        metadata: {
            title: 'Advanced Neural Networks',
            author: 'Dr. Jones',
            category: 'research',
            tags: ['neural-networks', 'deep-learning'],
            published_date: '2024-01-20'
        }
    }
];

// Upsert vectors
try {
    const result = await index.upsert(vectorItems);
    console.log('Upsert result:', result.message);
    // Output: "Upserted 2 vectors"
    
    console.log(`Successfully added ${vectorItems.length} vectors to the index`);
} catch (error) {
    console.error('Upsert failed:', error.message);
}

Updating Existing Vectors

// Add initial vector
const initialVector: VectorItem = {
    id: 'updatable_doc',
    vector: [0.1, 0.2, 0.3, /* ... */],
    contents: 'Original content',
    metadata: { version: 1, status: 'draft' }
};

await index.upsert([initialVector]);

// Update the same vector with new data
const updatedVector: VectorItem = {
    id: 'updatable_doc', // Same ID - will update existing
    vector: [0.15, 0.25, 0.35, /* ... */], // Updated vector
    contents: 'Updated content with more information',
    metadata: { version: 2, status: 'published', updated_date: '2024-01-25' }
};

try {
    const updateResult = await index.upsert([updatedVector]);
    console.log('Update result:', updateResult.message);
    
    // Verify the update
    const retrievedVector = await index.get(['updatable_doc']);
    console.log('Updated vector:', retrievedVector[0]);
} catch (error) {
    console.error('Update failed:', error.message);
}

Upsert with Content & Metadata

interface DocumentData {
    id: string;
    title: string;
    content: string;
    author: string;
    category: string;
    vector: number[];
}

async function processBatchDocuments(
    index: EncryptedIndex, 
    documents: DocumentData[]
) {
    // Convert documents to VectorItem format
    const vectorItems: VectorItem[] = documents.map(doc => ({
        id: doc.id,
        vector: doc.vector,
        contents: doc.content,
        metadata: {
            title: doc.title,
            author: doc.author,
            category: doc.category,
            word_count: doc.content.split(' ').length,
            char_count: doc.content.length,
            processed_date: new Date().toISOString()
        }
    }));
    
    try {
        console.log(`Processing batch of ${vectorItems.length} documents`);
        
        const result = await index.upsert(vectorItems);
        console.log('Batch upsert completed:', result.message);
        
        return {
            success: true,
            count: vectorItems.length,
            message: result.message
        };
    } catch (error) {
        console.error('Batch upsert failed:', error.message);
        return {
            success: false,
            error: error.message
        };
    }
}

// Usage
const documents: DocumentData[] = [
    {
        id: 'article_001',
        title: 'Getting Started with TypeScript',
        content: 'TypeScript is a powerful superset of JavaScript...',
        author: 'Jane Developer',
        category: 'programming',
        vector: [0.1, 0.2, /* ... */]
    },
    {
        id: 'article_002',
        title: 'React Best Practices',
        content: 'When building React applications, it is important...',
        author: 'John React',
        category: 'frontend',
        vector: [0.3, 0.4, /* ... */]
    }
];

const batchResult = await processBatchDocuments(index, documents);
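
Chunked Upsert for Large Batches

For larger document sets, splitting the items into smaller batches rather than issuing one large upsert call can keep individual requests manageable. The helper below is a minimal sketch of that pattern; the default chunk size of 100 is an arbitrary illustration rather than a library recommendation, and EncryptedIndex follows the type annotation used in the batch example above.

// Sketch: upsert items in fixed-size chunks to keep each request small.
// The chunk size is illustrative; tune it for your payload sizes and server limits.
async function upsertInChunks(
    index: EncryptedIndex,
    items: VectorItem[],
    chunkSize = 100
): Promise<void> {
    for (let i = 0; i < items.length; i += chunkSize) {
        const chunk = items.slice(i, i + chunkSize);
        const result = await index.upsert(chunk);
        console.log(`Chunk ${Math.floor(i / chunkSize) + 1}:`, result.message);
    }
}

// Usage: reuses the index and vectorItems from the basic example above
await upsertInChunks(index, vectorItems);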