Kaizer supports multiple data loading methods, each tailored to different data types, volumes, and latency requirements. Users can choose the method that best fits their application, whether they are working with large datasets, streaming feeds, or real-time inputs.
- Batch Uploads: Ideal for loading large datasets at once, batch uploads allow you to integrate data files directly into Kaizer's storage and processing layers. Supported file formats include CSV, JSON, and Parquet. (A minimal upload sketch follows this list.)
- API-Based Data Loading: Use Kaizer's REST APIs or SDKs to programmatically load data into your tables. This method is ideal for integrating external systems, IoT devices, or other data-generating platforms with Kaizer. (See the row-insert sketch after this list.)
- Decentralized Storage Integration: Seamlessly load data from decentralized storage networks such as IPFS, Arweave, or Filecoin into Kaizer. The platform's connectors ensure that data is securely ingested and indexed for query operations. (See the IPFS example below.)
- External Data Sources: Integrate data from traditional databases, cloud services, and third-party APIs. Kaizer's platform provides connectors and ETL (Extract, Transform, Load) tools to facilitate this process, ensuring that external data is ingested with minimal friction and maximum security.
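
The following is a minimal batch-upload sketch in Python. It assumes a REST endpoint of the form `/tables/<table>/batch`, bearer-token authentication, and multipart file upload; the base URL, route, header names, and table name are illustrative assumptions, not Kaizer's documented API.

```python
# Hypothetical batch-upload sketch. The endpoint, auth header, and table
# name are assumptions made for illustration, not Kaizer's documented API.
import requests

KAIZER_API = "https://api.kaizer.example/v1"   # assumed base URL
API_KEY = "YOUR_API_KEY"                        # assumed auth scheme

def upload_file(table: str, path: str) -> dict:
    """Upload a CSV/JSON/Parquet file into a Kaizer table in one batch."""
    with open(path, "rb") as f:
        resp = requests.post(
            f"{KAIZER_API}/tables/{table}/batch",           # assumed route
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"file": (path, f)},                      # multipart upload
            timeout=300,
        )
    resp.raise_for_status()
    return resp.json()  # e.g. an ingestion-job descriptor to poll for status

if __name__ == "__main__":
    print(upload_file("sensor_readings", "readings_2024.parquet"))
```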
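For API-based loading, the sketch below shows how an external system such as an IoT device might push small batches of JSON rows. Again, the route (`/tables/<table>/rows`), payload shape, and field names are assumptions used only to illustrate the pattern.

```python
# Hypothetical row-level loading sketch, e.g. from an IoT device.
# Endpoint shape, payload format, and field names are assumptions.
import time
import requests

KAIZER_API = "https://api.kaizer.example/v1"   # assumed base URL
API_KEY = "YOUR_API_KEY"                        # assumed auth scheme

def insert_rows(table: str, rows: list[dict]) -> None:
    """POST a small batch of JSON rows into a Kaizer table."""
    resp = requests.post(
        f"{KAIZER_API}/tables/{table}/rows",                # assumed route
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"rows": rows},
        timeout=30,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    # Push one reading per second for ten seconds.
    for _ in range(10):
        insert_rows("sensor_readings", [{"device_id": "dev-42",
                                         "ts": time.time(),
                                         "temp_c": 21.7}])
        time.sleep(1)
```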
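For decentralized storage, one possible pattern is to resolve a content identifier (CID) through a public IPFS HTTP gateway and hand the retrieved file to the same (assumed) batch endpoint. The gateway URL is a real public IPFS gateway, but the Kaizer side of this sketch reuses the hypothetical endpoint above rather than the platform's actual connector interface, which is not specified here.

```python
# Hypothetical sketch: fetch a content-addressed file from IPFS and load it
# into Kaizer via the assumed batch endpoint shown earlier.
import requests

KAIZER_API = "https://api.kaizer.example/v1"   # assumed base URL
API_KEY = "YOUR_API_KEY"                        # assumed auth scheme
IPFS_GATEWAY = "https://ipfs.io/ipfs"           # public IPFS HTTP gateway

def ingest_from_ipfs(table: str, cid: str) -> dict:
    """Fetch a file by CID from IPFS and load it into a Kaizer table."""
    data = requests.get(f"{IPFS_GATEWAY}/{cid}", timeout=120)
    data.raise_for_status()
    resp = requests.post(
        f"{KAIZER_API}/tables/{table}/batch",               # assumed route
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"file": (cid, data.content)},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()
```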