Tag: Vector Databases
-
How LLMs give semantic meaning to a prompt | ngrok blog
In summary, embeddings are points in n-dimensional space that you can think of as the semantic meaning of the text they represent. During training, each token gets moved within this space to be close to other, similar tokens. The more dimensions, the more complex and nuanced the LLM’s representation of each token can be. —…
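The "points in n-dimensional space" idea can be made concrete with cosine similarity, the usual way to compare embeddings. A minimal sketch with made-up 3-dimensional vectors (real LLM embeddings have hundreds or thousands of dimensions, and the values below are illustrative only):

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of magnitudes: 1.0 = same direction.
    dot = sum(x * y for x, y in zip(a, b))
    mag_a = math.sqrt(sum(x * x for x in a))
    mag_b = math.sqrt(sum(y * y for y in b))
    return dot / (mag_a * mag_b)

# Toy "embeddings": tokens trained to be semantically close end up
# pointing in nearly the same direction in the space.
king  = [0.90, 0.80, 0.10]
queen = [0.88, 0.82, 0.12]
pizza = [0.10, 0.20, 0.95]

print(cosine_similarity(king, queen))  # near 1.0: semantically similar
print(cosine_similarity(king, pizza))  # much lower: semantically distant
```

The same comparison works regardless of dimension count; more dimensions just give the model more room to encode nuance, as the excerpt notes.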
-
Scaling Hierarchical Navigable Small World (HNSW) Vectors in Redis
if you have different vectors for the same use case split across different instances / keys, you can send the same VSIM query to each instance, add the WITHSCORES option (which returns the cosine distance), and merge the results client-side. You have magically scaled to your hundreds of millions of vectors…
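The fan-out-and-merge step described above can be sketched as plain Python. The shard results and `doc:` element names below are hypothetical stand-ins for what each instance's VSIM ... WITHSCORES reply would give you; this sketch assumes higher score = more similar (if your scores are raw distances, sort ascending instead):

```python
def merge_topk(per_instance_results, k):
    """Merge {element: score} replies from several instances into a global top-k."""
    best = {}
    for results in per_instance_results:
        for element, score in results.items():
            # If the same element shows up in more than one key, keep its best score.
            if element not in best or score > best[element]:
                best[element] = score
    # Global ranking: highest score first, truncated to k results.
    return sorted(best.items(), key=lambda item: item[1], reverse=True)[:k]

# Hypothetical replies from two shards holding vectors for the same use case:
shard_a = {"doc:1": 0.98, "doc:7": 0.91, "doc:3": 0.80}
shard_b = {"doc:9": 0.95, "doc:7": 0.93, "doc:2": 0.60}

print(merge_topk([shard_a, shard_b], 3))
# [('doc:1', 0.98), ('doc:9', 0.95), ('doc:7', 0.93)]
```

Because each shard only needs to return its local top-k, the merge is cheap, and the expensive HNSW traversal stays parallelized across instances.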