If you've previously created an encrypted index, you can connect to it to add, query, or delete data. To do so, you will need the index's name as well as its encryption key:
```python
import cyborgdb_core as cyborgdb
import secrets

# Using `memory` storage for this example
index_location = cyborgdb.DBConfig("memory")
config_location = cyborgdb.DBConfig("memory")

# Get your API key
api_key = "your_api_key_here"  # Replace with your actual API key

# Create a client
client = cyborgdb.Client(
    api_key=api_key,
    index_location=index_location,
    config_location=config_location
)

# Provide the index key used when creating the index
# (a random key is generated here for demonstration only)
index_key = secrets.token_bytes(32)

# Load the encrypted index
index = client.load_index(
    index_name="my_index",
    index_key=index_key
)
```
You will need to replace index_key with your own index encryption key.
For production use, we recommend that you use an HSM or KMS solution.
For more details, see Managing Encryption Keys.
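Short of a full HSM/KMS integration, you can at least keep the key out of your source code. A minimal sketch, assuming the 32-byte key is stored hex-encoded in an environment variable (the variable name `CYBORGDB_INDEX_KEY` and the zero-filled fallback are illustrative only):

```python
import os

# Illustrative only: read a hex-encoded 32-byte index key from an
# environment variable. CYBORGDB_INDEX_KEY is a hypothetical name;
# the zero-filled fallback is for demonstration, never for production.
key_hex = os.environ.get("CYBORGDB_INDEX_KEY", "00" * 32)
index_key = bytes.fromhex(key_hex)

# load_index expects a 32-byte key
assert len(index_key) == 32, "index key must be exactly 32 bytes"
```

In production, the same variable would be populated at deploy time from your secrets manager rather than committed anywhere.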
For improved query performance, you can enable encrypted index caching by setting a max_cache_size:
```python
import cyborgdb_core as cyborgdb
import secrets

# Using `memory` storage for this example
index_location = cyborgdb.DBConfig("memory")
config_location = cyborgdb.DBConfig("memory")

# Get your API key
api_key = "your_api_key_here"  # Replace with your actual API key

# Create a client
client = cyborgdb.Client(
    api_key=api_key,
    index_location=index_location,
    config_location=config_location
)

# Provide the index key used when creating the index
# (a random key is generated here for demonstration only)
index_key = secrets.token_bytes(32)

# Set max cache size to 1 MB (specified in bytes)
max_cache_size = 1000000

# Load the encrypted index
index = client.load_index(
    index_name="my_index",
    index_key=index_key,
    max_cache_size=max_cache_size
)
```
The maximum cache size is specified in bytes (the example above sets it to 1 MB). When the cache reaches this size, the least recently used entries are evicted.
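To choose a sensible value, you can estimate the memory footprint of your vectors. A rough sizing sketch, assuming the value is in bytes (as in the 1 MB example above) and float32 vectors at 4 bytes per dimension; the counts are illustrative, not recommendations:

```python
# Illustrative sizing: a cache large enough to hold every vector.
num_vectors = 10_000     # hypothetical number of vectors in the index
dim = 128                # hypothetical vector dimensionality
bytes_per_value = 4      # float32

# 10,000 * 128 * 4 = 5,120,000 bytes, i.e. about 5 MB
max_cache_size = num_vectors * dim * bytes_per_value
```

If memory is constrained, set a smaller budget and let LRU eviction keep the hottest entries resident.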