chore: update SDK settings #221

Merged · 1 commit · Jun 5, 2024
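
In practice, this settings update renames the published package and import path used throughout the docs from `openlayer-test` to `openlayer`, which is what the file diffs below reflect. A minimal usage sketch under that assumption, using only the install command, import, and client setup shown in the updated README:

```python
import os

# After this change the package installs under the new name
# (previously: pip install --pre openlayer-test):
#
#   pip install --pre openlayer

from openlayer import Openlayer  # new import path; the package was `openlayer-test`

# Per the updated README, passing the API key explicitly is the default
# behaviour and can be omitted when OPENLAYER_API_KEY is set.
client = Openlayer(api_key=os.environ.get("OPENLAYER_API_KEY"))
```

The streaming, retry, timeout, and raw-response examples in the README diff below all build on this same client object.
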
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -32,7 +32,7 @@ $ pip install -r requirements-dev.lock
## Modifying/Adding code

Most of the SDK is generated code, and any modified code will be overridden on the next generation. The
`src/openlayer-test/lib/` and `examples/` directories are exceptions and will never be overridden.
`src/openlayer/lib/` and `examples/` directories are exceptions and will never be overridden.

## Adding and running examples

173 changes: 95 additions & 78 deletions README.md
@@ -1,6 +1,6 @@
# Openlayer Python API library

[![PyPI version](https://img.shields.io/pypi/v/openlayer-test.svg)](https://pypi.org/project/openlayer-test/)
[![PyPI version](https://img.shields.io/pypi/v/openlayer.svg)](https://pypi.org/project/openlayer/)

The Openlayer Python library provides convenient access to the Openlayer REST API from any Python 3.7+
application. The library includes type definitions for all request params and response fields,
@@ -16,7 +16,7 @@ The REST API documentation can be found [on openlayer.com](https://openlayer.com

```sh
# install from PyPI
pip install --pre openlayer-test
pip install --pre openlayer
```

## Usage
@@ -25,7 +25,7 @@ The full API of this library can be found in [api.md](api.md).

```python
import os
from openlayer-test import Openlayer
from openlayer import Openlayer

client = Openlayer(
# This is the default and can be omitted
@@ -41,13 +41,15 @@ data_stream_response = client.inference_pipelines.data.stream(
"cost_column_name": "cost",
"timestamp_column_name": "timestamp",
},
rows=[{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
}],
rows=[
{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
}
],
)
print(data_stream_response.success)
```
@@ -64,32 +66,36 @@ Simply import `AsyncOpenlayer` instead of `Openlayer` and use `await` with each
```python
import os
import asyncio
from openlayer-test import AsyncOpenlayer
from openlayer import AsyncOpenlayer

client = AsyncOpenlayer(
# This is the default and can be omitted
api_key=os.environ.get("OPENLAYER_API_KEY"),
)


async def main() -> None:
data_stream_response = await client.inference_pipelines.data.stream(
"182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
config={
"input_variable_names": ["user_query"],
"output_column_name": "output",
"num_of_token_column_name": "tokens",
"cost_column_name": "cost",
"timestamp_column_name": "timestamp",
},
rows=[{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
}],
)
print(data_stream_response.success)
data_stream_response = await client.inference_pipelines.data.stream(
"182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
config={
"input_variable_names": ["user_query"],
"output_column_name": "output",
"num_of_token_column_name": "tokens",
"cost_column_name": "cost",
"timestamp_column_name": "timestamp",
},
rows=[
{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
}
],
)
print(data_stream_response.success)


asyncio.run(main())
```
@@ -107,16 +113,16 @@ Typed requests and responses provide autocomplete and documentation within your

## Handling errors

When the library is unable to connect to the API (for example, due to network connection problems or a timeout), a subclass of `openlayer-test.APIConnectionError` is raised.
When the library is unable to connect to the API (for example, due to network connection problems or a timeout), a subclass of `openlayer.APIConnectionError` is raised.

When the API returns a non-success status code (that is, 4xx or 5xx
response), a subclass of `openlayer-test.APIStatusError` is raised, containing `status_code` and `response` properties.
response), a subclass of `openlayer.APIStatusError` is raised, containing `status_code` and `response` properties.

All errors inherit from `openlayer-test.APIError`.
All errors inherit from `openlayer.APIError`.

```python
import openlayer-test
from openlayer-test import Openlayer
import openlayer
from openlayer import Openlayer

client = Openlayer()

@@ -130,20 +136,22 @@ try:
"cost_column_name": "cost",
"timestamp_column_name": "timestamp",
},
rows=[{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
}],
rows=[
{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
}
],
)
except openlayer-test.APIConnectionError as e:
except openlayer.APIConnectionError as e:
print("The server could not be reached")
print(e.__cause__) # an underlying Exception, likely raised within httpx.
except openlayer-test.RateLimitError as e:
print(e.__cause__) # an underlying Exception, likely raised within httpx.
except openlayer.RateLimitError as e:
print("A 429 status code was received; we should back off a bit.")
except openlayer-test.APIStatusError as e:
except openlayer.APIStatusError as e:
print("Another non-200-range status code was received")
print(e.status_code)
print(e.response)
@@ -171,7 +179,7 @@ Connection errors (for example, due to a network connectivity problem), 408 Requ
You can use the `max_retries` option to configure or disable retry settings:

```python
from openlayer-test import Openlayer
from openlayer import Openlayer

# Configure the default for all requests:
client = Openlayer(
@@ -180,7 +188,7 @@ client = Openlayer(
)

# Or, configure per-request:
client.with_options(max_retries = 5).inference_pipelines.data.stream(
client.with_options(max_retries=5).inference_pipelines.data.stream(
"182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
config={
"input_variable_names": ["user_query"],
@@ -189,13 +197,15 @@ client.with_options(max_retries = 5).inference_pipelines.data.stream(
"cost_column_name": "cost",
"timestamp_column_name": "timestamp",
},
rows=[{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
}],
rows=[
{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
}
],
)
```

@@ -205,7 +215,7 @@ By default requests time out after 1 minute. You can configure this with a `time
which accepts a float or an [`httpx.Timeout`](https://www.python-httpx.org/advanced/#fine-tuning-the-configuration) object:

```python
from openlayer-test import Openlayer
from openlayer import Openlayer

# Configure the default for all requests:
client = Openlayer(
@@ -219,7 +229,7 @@ client = Openlayer(
)

# Override per-request:
client.with_options(timeout = 5.0).inference_pipelines.data.stream(
client.with_options(timeout=5.0).inference_pipelines.data.stream(
"182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
config={
"input_variable_names": ["user_query"],
@@ -228,13 +238,15 @@ client.with_options(timeout = 5.0).inference_pipelines.data.stream(
"cost_column_name": "cost",
"timestamp_column_name": "timestamp",
},
rows=[{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
}],
rows=[
{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
}
],
)
```

@@ -271,7 +283,7 @@ if response.my_field is None:
The "raw" Response object can be accessed by prefixing `.with_raw_response.` to any HTTP method call, e.g.,

```py
from openlayer-test import Openlayer
from openlayer import Openlayer

client = Openlayer()
response = client.inference_pipelines.data.with_raw_response.stream(
@@ -297,9 +309,9 @@ data = response.parse() # get the object that `inference_pipelines.data.stream(
print(data.success)
```

These methods return an [`APIResponse`](https://github.com/openlayer-ai/openlayer-python/tree/main/src/openlayer-test/_response.py) object.
These methods return an [`APIResponse`](https://github.com/openlayer-ai/openlayer-python/tree/main/src/openlayer/_response.py) object.

The async client returns an [`AsyncAPIResponse`](https://github.com/openlayer-ai/openlayer-python/tree/main/src/openlayer-test/_response.py) with the same structure, the only difference being `await`able methods for reading the response content.
The async client returns an [`AsyncAPIResponse`](https://github.com/openlayer-ai/openlayer-python/tree/main/src/openlayer/_response.py) with the same structure, the only difference being `await`able methods for reading the response content.

#### `.with_streaming_response`

@@ -317,18 +329,20 @@ with client.inference_pipelines.data.with_streaming_response.stream(
"cost_column_name": "cost",
"timestamp_column_name": "timestamp",
},
rows=[{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
}],
) as response :
print(response.headers.get('X-My-Header'))
rows=[
{
"user_query": "what's the meaning of life?",
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
}
],
) as response:
print(response.headers.get("X-My-Header"))

for line in response.iter_lines():
print(line)
print(line)
```

The context manager is required so that the response will reliably be closed.
@@ -377,12 +391,15 @@ You can directly override the [httpx client](https://www.python-httpx.org/api/#c
- Additional [advanced](https://www.python-httpx.org/advanced/#client-instances) functionality

```python
from openlayer-test import Openlayer, DefaultHttpxClient
from openlayer import Openlayer, DefaultHttpxClient

client = Openlayer(
# Or use the `OPENLAYER_BASE_URL` env var
base_url="http://my.test.server.example.com:8083",
http_client=DefaultHttpxClient(proxies="http://my.test.proxy.example.com", transport=httpx.HTTPTransport(local_address="0.0.0.0")),
http_client=DefaultHttpxClient(
proxies="http://my.test.proxy.example.com",
transport=httpx.HTTPTransport(local_address="0.0.0.0"),
),
)
```
