
Releases: Aleph-Alpha/aleph-alpha-client

2.2.1

11 Aug 09:42
4e04f45

What's Changed

Bugfix

  • Restore the original error handling of HTTP status codes, as it was before 2.2.0
  • Add a dedicated BusyError exception for status code 503
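
A minimal sketch of catching the new exception, assuming BusyError is exported from the aleph_alpha_client package; the request setup mirrors the semantic embedding examples further down:

import os
from aleph_alpha_client import AlephAlphaClient, AlephAlphaModel, BusyError, Prompt, SemanticEmbeddingRequest, SemanticRepresentation

model = AlephAlphaModel(
    AlephAlphaClient(host="https://api.aleph-alpha.com", token=os.getenv("AA_TOKEN")),
    model_name="luminous-base"
)

request = SemanticEmbeddingRequest(prompt=Prompt.from_text("hello"), representation=SemanticRepresentation.Symmetric)
try:
    result = model.semantic_embed(request)
except BusyError:
    # The API answered with HTTP 503: the model is temporarily overloaded.
    print("Model busy, try again later")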

Full Changelog: v2.2.0...v2.2.1

v2.2.0

10 Aug 15:14
af0cb94

What's Changed

New feature

  • Retry failed HTTP requests via urllib for status codes 408, 429, 500, 502, 503, 504
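
For context, this behaviour is roughly what the urllib3/requests configuration below would give you; the retry count and backoff used by the client are not stated in this note, so the numbers here are illustrative assumptions:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry the listed status codes with exponential backoff
# (total=3 and backoff_factor=0.5 are illustrative, not the client's actual values).
retries = Retry(total=3, backoff_factor=0.5, status_forcelist=[408, 429, 500, 502, 503, 504])
session = requests.Session()
session.mount("https://", HTTPAdapter(max_retries=retries))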

Full Changelog: v2.1.0...v2.2.0

2.1.0

10 Aug 11:56
045b6f0

What's Changed

New feature

  • Add new parameters to control how repetition penalties are applied for completion requests (see the docs for more information, and the sketch after this list):
    • penalty_bias
    • penalty_exceptions
    • penalty_exceptions_include_stop_sequences
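
A hedged sketch of passing the new parameters on a completion request; CompletionRequest and model.complete come from the client, while the concrete values (and the exact type expected by penalty_bias) are illustrative assumptions, so check the docs before relying on them:

import os
from aleph_alpha_client import AlephAlphaClient, AlephAlphaModel, CompletionRequest, Prompt

model = AlephAlphaModel(
    AlephAlphaClient(host="https://api.aleph-alpha.com", token=os.getenv("AA_TOKEN")),
    model_name="luminous-base"
)

request = CompletionRequest(
    prompt=Prompt.from_text("An apple a day"),
    maximum_tokens=32,
    presence_penalty=0.5,
    # penalty_bias=...,                              # optional; see the docs for the expected type
    penalty_exceptions=["apple"],                    # strings that should not be penalized when repeated
    penalty_exceptions_include_stop_sequences=True,  # also exempt stop sequences from the penalty
)
result = model.complete(request)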

Full Changelog: 2.0.0...v2.1.0

2.0.0

08 Aug 12:23
f4a6937

What's Changed

Breaking change

  • Make the hosting parameter optional in semantic_embed on the client. This changes the order of the hosting and request parameters, which should not be an issue unless you call semantic_embed on the client directly with positional arguments.
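
A minimal sketch of the keyword-argument call style that is unaffected by the reorder; the keyword names used at the client level here (model, request) are assumptions and should be checked against the actual signature:

import os
from aleph_alpha_client import AlephAlphaClient, Prompt, SemanticEmbeddingRequest, SemanticRepresentation

client = AlephAlphaClient(host="https://api.aleph-alpha.com", token=os.getenv("AA_TOKEN"))
request = SemanticEmbeddingRequest(prompt=Prompt.from_text("hello"), representation=SemanticRepresentation.Symmetric)

# Keyword arguments are insensitive to the hosting/request reorder,
# and hosting can simply be omitted now that it is optional.
# NOTE: the keyword names below are assumed, not verified against the client.
result = client.semantic_embed(model="luminous-base", request=request)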

Experimental feature

  • Add experimental penalty parameters for completion

Full Changelog: v1.7.1...2.0.0

v1.7.1 - Improved handling of text-based Documents in Q&A

01 Aug 09:26
40628fd

What's Changed

  • Improved handling of text-based Documents in Q&A
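
A hedged sketch of using a text-based Document in a Q&A request; Document.from_text and the QaRequest fields shown are assumptions about the client's Q&A API and should be verified against the docs:

import os
from aleph_alpha_client import AlephAlphaClient, AlephAlphaModel, Document, QaRequest

model = AlephAlphaModel(
    AlephAlphaClient(host="https://api.aleph-alpha.com", token=os.getenv("AA_TOKEN")),
    model_name="luminous-base"
)

# A Document built from plain text rather than a docx file or a prompt.
document = Document.from_text("Deep learning is part of a broader family of machine learning methods.")
request = QaRequest(query="What is deep learning part of?", documents=[document])
result = model.qa(request)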

Full Changelog: v1.7.0...v1.7.1

Semantic embedding

28 Jul 14:33

What's Changed

  • Add semantic embedding
  • Add an optional timeout parameter to the client
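
A short sketch of the new option; the exact keyword name for the timeout is not given in this note, so the one below is an assumption:

import os
from aleph_alpha_client import AlephAlphaClient

client = AlephAlphaClient(
    host="https://api.aleph-alpha.com",
    token=os.getenv("AA_TOKEN"),
    request_timeout_seconds=30,  # assumed keyword; check the client's signature
)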

Full Changelog: v1.6.0...v1.7.0

Examples

Semantic embedding

Symmetric

from typing import Sequence
from aleph_alpha_client import AlephAlphaClient, AlephAlphaModel, SemanticEmbeddingRequest, SemanticRepresentation, Prompt
import math
import os

model = AlephAlphaModel(
    AlephAlphaClient(host="https://api.aleph-alpha.com", token=os.getenv("AA_TOKEN")),
    # This example only embeds text, so any model that supports semantic embeddings will do.
    model_name="luminous-base"
)

# Texts to compare
texts = [
    "deep learning",
    "artificial intelligence",
    "deep diving",
    "artificial snow",
]

embeddings = []

for text in texts:
    request = SemanticEmbeddingRequest(prompt=Prompt.from_text(text), representation=SemanticRepresentation.Symmetric)
    result = model.semantic_embed(request)
    embeddings.append(result.embedding)

# Calculate cosine similarities. You could also use numpy or scipy for this.
def cosine_similarity(v1: Sequence[float], v2: Sequence[float]) -> float:
    """Compute the cosine similarity of v1 and v2: (v1 dot v2) / (||v1|| * ||v2||)."""
    sumxx, sumxy, sumyy = 0.0, 0.0, 0.0
    for x, y in zip(v1, v2):
        sumxx += x * x
        sumyy += y * y
        sumxy += x * y
    return sumxy / math.sqrt(sumxx * sumyy)
# Cosine similarities are in [-1, 1]. Higher means more similar
print("Cosine similarity between \"%s\" and \"%s\" is: %.3f" % (texts[0], texts[1], cosine_similarity(embeddings[0], embeddings[1])))
print("Cosine similarity between \"%s\" and \"%s\" is: %.3f" % (texts[0], texts[2], cosine_similarity(embeddings[0], embeddings[2])))
print("Cosine similarity between \"%s\" and \"%s\" is: %.3f" % (texts[0], texts[3], cosine_similarity(embeddings[0], embeddings[3])))

Documents and Query

from typing import Sequence
from aleph_alpha_client import AlephAlphaClient, AlephAlphaModel, SemanticEmbeddingRequest, SemanticRepresentation, Prompt
import math
import os

model = AlephAlphaModel(
    AlephAlphaClient(host="https://api.aleph-alpha.com", token=os.getenv("AA_TOKEN")),
    # This example only embeds text, so any model that supports semantic embeddings will do.
    model_name="luminous-base"
)

# Documents to search in
documents = [
    # AI wikipedia article
    "Artificial intelligence (AI) is intelligence demonstrated by machines, as opposed to the natural intelligence displayed by animals including humans. AI research has been defined as the field of study of intelligent agents, which refers to any system that perceives its environment and takes actions that maximize its chance of achieving its goals.",
    # Deep Learning Wikipedia article
    "Deep learning (also known as deep structured learning) is part of a broader family of machine learning methods based on artificial neural networks with representation learning. Learning can be supervised, semi-supervised or unsupervised.",
    # Deep Diving Wikipedia article
    "Deep diving is underwater diving to a depth beyond the norm accepted by the associated community. In some cases this is a prescribed limit established by an authority, while in others it is associated with a level of certification or training, and it may vary depending on whether the diving is recreational, technical or commercial. Nitrogen narcosis becomes a hazard below 30 metres (98 ft) and hypoxic breathing gas is required below 60 metres (200 ft) to lessen the risk of oxygen toxicity.",
]
# Keyword to search documents with
query = "artificial intelligence"

# Embed Query
request = SemanticEmbeddingRequest(prompt=Prompt.from_text(query), representation=SemanticRepresentation.Query)
result = model.semantic_embed(request)
query_embedding = result.embedding

# Embed documents
document_embeddings = []

for document in documents:
    request = SemanticEmbeddingRequest(prompt=Prompt.from_text(document), representation=SemanticRepresentation.Document)
    result = model.semantic_embed(request)
    document_embeddings.append(result.embedding)

# Calculate cosine similarities. You could also use numpy or scipy for this.
def cosine_similarity(v1: Sequence[float], v2: Sequence[float]) -> float:
    """Compute the cosine similarity of v1 and v2: (v1 dot v2) / (||v1|| * ||v2||)."""
    sumxx, sumxy, sumyy = 0.0, 0.0, 0.0
    for x, y in zip(v1, v2):
        sumxx += x * x
        sumyy += y * y
        sumxy += x * y
    return sumxy / math.sqrt(sumxx * sumyy)
# Cosine similarities are in [-1, 1]. Higher means more similar
print("Cosine similarity between \"%s\" and \"%s...\" is: %.3f" % (query, documents[0][:10], cosine_similarity(query_embedding, document_embeddings[0])))
print("Cosine similarity between \"%s\" and \"%s...\" is: %.3f" % (query, documents[1][:10], cosine_similarity(query_embedding, document_embeddings[1])))
print("Cosine similarity between \"%s\" and \"%s...\" is: %.3f" % (query, documents[2][:10], cosine_similarity(query_embedding, document_embeddings[2])))

Introduce AlephAlphaModel as an alternative to direct usage of AlephAlphaClient

28 Jun 07:40

Highlights of this release

  • Introduce AlephAlphaModel as a more convenient alternative to direct usage of AlephAlphaClient
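
A minimal construction sketch, mirroring the semantic embedding examples in the v1.7.0 release above: AlephAlphaModel wraps an AlephAlphaClient together with a model name, so subsequent calls only need a request object.

import os
from aleph_alpha_client import AlephAlphaClient, AlephAlphaModel

# Pin host, token and model name once instead of passing them per call.
model = AlephAlphaModel(
    AlephAlphaClient(host="https://api.aleph-alpha.com", token=os.getenv("AA_TOKEN")),
    model_name="luminous-base"
)
# Request objects (e.g. SemanticEmbeddingRequest) are then passed to methods
# such as model.semantic_embed(request), as shown in the examples above.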

Breaking Changes

  • None

Full Changelog

v1.5.0...v1.6.0

Version 1.5.0

15 Jun 13:20

Bump version to 1.5.0.

Version 1.4.2

16 May 11:39

Remove dependency on setuptools_scm and importlib.metadata (#24).

Version 1.4.1

04 May 09:14

fix: Q&A docs examples