Extract

Fetch and render websites or data from URLs or local uploads for data extraction



Load

Provide a database URL to read/write the extracted/transformed data and/or the vector embeddings generated under "AI"


Database providers

  1. Redis: Redis Cloud, Upstash
  2. PostgreSQL: Timescale, Neon
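
As a sketch, connection URLs for these provider families follow the usual `scheme://user:password@host:port/db` shape; the hostnames and credentials below are placeholders, not real endpoints:

```javascript
// Illustrative URL shapes only; hosts and credentials are placeholders.
const examples = {
  redis: "redis://default:secret@example.upstash.io:6379",
  postgres: "postgresql://user:secret@example.neon.tech:5432/mydb",
};

// The URL scheme alone is enough to tell the provider families apart.
const providerOf = (url) => new URL(url).protocol.replace(":", "");
providerOf(examples.redis);    // "redis"
providerOf(examples.postgres); // "postgresql"
```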

Paste a database URL anywhere on the page to autofill this field

If you haven't already, go under "Transform" to define functions that alter the extracted data as you please before it is written to the database

Interact with database


eltai.loaded = result

Store your data in an account
Zero trust? That's fine: clone and run it locally

Transform

Use scripts to modify data before
  1. writing it to the database under "Load"
  2. vectorizing it under "AI"
You can also view and save the result

Transform functions

Click the user icon under "Load" to create an account for saving your data, if you haven't already



  • /* Access global variables like `eltai.extracted` etc. here.
       Ensure you return a value that can be stringified, since HTTP data is mostly textual.
       Boolean checks on the returned value ignore falsies. */
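
A transform function under those constraints might look like the following sketch; the shape of `eltai.extracted` (an array of strings) is an assumption for illustration:

```javascript
// Sketch of a transform function: reads the global eltai.extracted
// (assumed here to be an array of strings), normalizes the entries,
// and returns a stringifiable value that is truthy when non-empty.
function transform(eltai) {
  const cleaned = (eltai.extracted || [])
    .map((s) => s.trim().toLowerCase())
    .filter(Boolean); // drop whitespace-only entries (falsies are ignored)
  return JSON.stringify(cleaned); // textual, so it survives HTTP transport
}

transform({ extracted: [" Hello ", ""] }); // → '["hello"]'
```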

AI

Apply logic to vector embeddings generated from selected data, for any purpose you wish
Create or view workflows and schedule them to run periodically

Choose AI model provider for embeddings

can be used to host Ollama models remotely
Set up the AI provider, or ignore this step if your choice is "default"
Optionally transform the extracted data with the chosen AI model

eltai.embedding = result
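
One common piece of "logic on embeddings" is similarity between two vectors; the sketch below uses cosine similarity, with made-up numbers standing in for real provider output in `eltai.embedding`:

```javascript
// Cosine similarity between two embedding vectors of equal length.
// In the app the vectors would come from eltai.embedding; the values
// here are illustrative stand-ins.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]; // accumulate dot product
    na += a[i] * a[i];  // squared norm of a
    nb += b[i] * b[i];  // squared norm of b
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

const query = [0.1, 0.9, 0.0];
const stored = [0.1, 0.9, 0.0];
cosine(query, stored); // ≈ 1 for identical vectors
```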


Extract data with scripts



forEach(node => {
  /* `node` is either a "#textnode" or an "input".
     `||=` avoids overriding an existing collection by initializing only
     when it is falsy; `eltai.extracted ||= new Set` or
     `eltai.extracted ||= {}` would behave the same way. */
  node.data?.trim() && (eltai.extracted ||= []).push(node.data)
})
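
To see what that callback collects, here is a minimal sketch that runs it over mocked node objects; in the app the nodes come from the rendered page, and `eltai` is a provided global:

```javascript
// Mocked run of the extraction callback: eltai and the node list are
// stand-ins for the globals the app provides.
const eltai = {};
const nodes = [
  { data: "Hello" }, // "#textnode" with content
  { data: "   " },   // whitespace-only: trim() is falsy, so it is skipped
  { data: "World" }, // "input" whose value is surfaced as data
];

nodes.forEach((node) => {
  node.data?.trim() && (eltai.extracted ||= []).push(node.data);
});

eltai.extracted; // → ["Hello", "World"]
```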