This repository has been archived by the owner on Oct 14, 2024. It is now read-only.

Commit

Merge branch 'main' into shem/fix_pendulum_conversion_problem
shemogumbe authored Aug 23, 2024
2 parents da59b34 + b4de602 commit bc725b9
Showing 17 changed files with 241 additions and 72 deletions.
2 changes: 1 addition & 1 deletion .github/CODEOWNERS
@@ -1 +1 @@
-* @samwelkanda @baywet @darrelmiller @ddyett @andrueastman @silaskenneth @ndiritu
+* @microsoft/kiota-write
8 changes: 8 additions & 0 deletions .github/dependabot.yml
@@ -5,6 +5,14 @@ updates:
schedule:
interval: daily
open-pull-requests-limit: 10
+groups:
+open-telemetry:
+patterns:
+- "*opentelemetry*"
+pylint:
+patterns:
+- "*pylint*"
+- "*astroid*"
- package-ecosystem: github-actions
directory: "/"
schedule:
8 changes: 4 additions & 4 deletions .github/policies/resourceManagement.yml
@@ -16,7 +16,7 @@ configuration:
- isIssue
- isOpen
- hasLabel:
-label: 'Needs: Author Feedback'
+label: 'status:waiting-for-author-feedback'
- hasLabel:
label: 'Status: No Recent Activity'
- noActivitySince:
@@ -31,7 +31,7 @@ configuration:
- isIssue
- isOpen
- hasLabel:
-label: 'Needs: Author Feedback'
+label: 'status:waiting-for-author-feedback'
- noActivitySince:
days: 4
- isNotLabeledWith:
@@ -64,13 +64,13 @@ configuration:
- isActivitySender:
issueAuthor: True
- hasLabel:
-label: 'Needs: Author Feedback'
+label: 'status:waiting-for-author-feedback'
- isOpen
then:
- addLabel:
label: 'Needs: Attention :wave:'
- removeLabel:
-label: 'Needs: Author Feedback'
+label: 'status:waiting-for-author-feedback'
description:
- if:
- payloadType: Issues
2 changes: 1 addition & 1 deletion .github/workflows/auto-merge-dependabot.yml
@@ -19,7 +19,7 @@ jobs:
steps:
- name: Dependabot metadata
id: metadata
-uses: dependabot/fetch-metadata@v1.6.0
+uses: dependabot/fetch-metadata@v2.2.0
with:
github-token: "${{ secrets.GITHUB_TOKEN }}"

4 changes: 4 additions & 0 deletions .github/workflows/conflicting-pr-label.yml
@@ -11,6 +11,10 @@ on:
types: [synchronize]
branches: [ main ]

+permissions:
+pull-requests: write
+contents: read

# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
# This workflow contains a single job called "build"
6 changes: 4 additions & 2 deletions .github/workflows/publish.yml
@@ -2,10 +2,12 @@ name: Publish package to PyPI and create release

on:
push:
-branches: [main]
tags:
- "v*" # Push events to matching v*, i.e. v1.0, v20.15.10

+permissions:
+contents: write

jobs:
build:
uses: ./.github/workflows/build.yml
@@ -41,7 +43,7 @@ jobs:
uses: actions/checkout@v4
- name: Extract release notes
id: extract-release-notes
-uses: ffurrer2/extract-release-notes@v1
+uses: ffurrer2/extract-release-notes@v2
- name: Create release
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
25 changes: 25 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,31 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

+## [1.3.0] - 2024-07-26
+
+### Added
+
+- Support `dict[str, Any]` and `list[dict[str, Any]]` when writing additional data.
+
+### Changed
+
+- Fixed a bug where date-time deserialization would fail because of empty strings.
+- Fixed a bug where float deserialization failed if the number represented qualified as an int.
+
+## [1.2.0] - 2024-04-09
+
+### Added
+
+### Changed
+- Enhanced error handling: Enabled silent failure when an enum key is not available
+
+## [1.1.0] - 2024-02-29
+
+### Added
+
+### Changed
+- Support objects and collections when writing additional data.
+
## [1.0.1] - 2023-12-16

### Added
1 change: 1 addition & 0 deletions CODE_OF_CONDUCT.md
@@ -7,3 +7,4 @@ Resources:
- [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/)
- [Microsoft Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/)
- Contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with questions or concerns
+- Employees can reach out at [aka.ms/opensource/moderation-support](https://aka.ms/opensource/moderation-support)
2 changes: 1 addition & 1 deletion kiota_serialization_json/_version.py
@@ -1 +1 @@
-VERSION: str = '1.0.1'
+VERSION: str = '1.3.0'
14 changes: 9 additions & 5 deletions kiota_serialization_json/json_parse_node.py
@@ -69,9 +69,9 @@ def get_int_value(self) -> Optional[int]:
def get_float_value(self) -> Optional[float]:
"""Gets the float value of the json node
Returns:
-float: The integer value of the node
+float: The number value of the node
"""
-return self._json_node if isinstance(self._json_node, float) else None
+return float(self._json_node) if isinstance(self._json_node, (float, int)) else None
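The widened number handling can be sketched in isolation. This standalone sketch mirrors the new return line; the free-function form and the sample values are illustrative, not the library's API surface:

```python
from typing import Any, Optional

def get_float_value(json_node: Any) -> Optional[float]:
    # JSON whole numbers parse as Python int, so a value such as 3 would
    # previously fail the isinstance(..., float) check and come back as
    # None; accepting int and coercing to float fixes that.
    return float(json_node) if isinstance(json_node, (float, int)) else None

print(get_float_value(3))      # 3.0
print(get_float_value(3.5))    # 3.5
print(get_float_value("3.5"))  # None (strings are not coerced)
```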

def get_uuid_value(self) -> Optional[UUID]:
"""Gets the UUID value of the json node
@@ -91,7 +91,11 @@ def get_datetime_value(self) -> Optional[datetime]:
"""
if isinstance(self._json_node, datetime):
return self._json_node

+if isinstance(self._json_node, str):
+if len(self._json_node) < 10:
+return None

datetime_obj = pendulum.parse(self._json_node, exact=True)
if isinstance(datetime_obj, pendulum.DateTime):
return datetime_obj
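A minimal standalone sketch of the new empty-string guard, with the standard library's `datetime.fromisoformat` standing in for `pendulum.parse` so the example has no third-party dependency:

```python
from datetime import datetime
from typing import Any, Optional

def get_datetime_value(json_node: Any) -> Optional[datetime]:
    if isinstance(json_node, datetime):
        return json_node
    if isinstance(json_node, str):
        # Anything shorter than "YYYY-MM-DD" (10 characters) cannot be a
        # date-time; this guard stops empty strings from raising during
        # deserialization, which is the bug the commit fixes.
        if len(json_node) < 10:
            return None
        return datetime.fromisoformat(json_node)  # stand-in for pendulum.parse
    return None

print(get_datetime_value(""))                     # None
print(get_datetime_value("2024-08-23T10:00:00"))  # 2024-08-23 10:00:00
```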
@@ -200,7 +204,7 @@ def get_enum_value(self, enum_class: K) -> Optional[K]:
try:
return enum_class[camel_case_key] # type: ignore
except KeyError:
-raise Exception(f'Invalid key: {camel_case_key} for enum {enum_class}.')
+return None
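The effect of the enum change, where an unknown key now yields `None` instead of raising, can be sketched like this. The `Color` enum and the key normalization are illustrative; the real method derives `camel_case_key` from the node value:

```python
from enum import Enum
from typing import Optional, Type, TypeVar

K = TypeVar("K", bound=Enum)

class Color(Enum):  # illustrative enum, not part of the library
    Red = "red"
    Blue = "blue"

def get_enum_value(raw_key: str, enum_class: Type[K]) -> Optional[K]:
    camel_case_key = raw_key[0].upper() + raw_key[1:]  # illustrative normalization
    try:
        return enum_class[camel_case_key]
    except KeyError:
        return None  # previously: raise Exception(f'Invalid key: ...')

print(get_enum_value("red", Color))    # Color.Red
print(get_enum_value("green", Color))  # None
```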

def get_object_value(self, factory: ParsableFactory) -> U:
"""Gets the model object value of the node
@@ -310,11 +314,11 @@ def try_get_anything(self, value: Any) -> Any:
if isinstance(datetime_obj, pendulum.Duration):
return datetime_obj.as_timedelta()
return datetime_obj
-except ValueError:
+except:
pass
try:
return UUID(value)
-except ValueError:
+except:
pass
return value
raise ValueError(f"Unexpected additional value type {type(value)} during deserialization.")
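The fallback chain in `try_get_anything` can be sketched standalone. Here `datetime.fromisoformat` stands in for `pendulum.parse` so the sketch runs without third-party packages, and the broadened exception handling mirrors the commit: any parse failure, not just `ValueError`, falls through to the next candidate type:

```python
from datetime import datetime
from typing import Any
from uuid import UUID

def try_get_anything(value: str) -> Any:
    # Try richer types in order; on any parse failure fall through.
    try:
        return datetime.fromisoformat(value)  # stand-in for pendulum.parse
    except Exception:
        pass
    try:
        return UUID(value)
    except Exception:
        pass
    return value  # plain string as the last resort

print(try_get_anything("2024-08-23"))   # 2024-08-23 00:00:00
print(try_get_anything("hello world"))  # hello world
```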
1 change: 0 additions & 1 deletion kiota_serialization_json/json_parse_node_factory.py
@@ -1,5 +1,4 @@
import json
from io import BytesIO

from kiota_abstractions.serialization import ParseNode, ParseNodeFactory

99 changes: 70 additions & 29 deletions kiota_serialization_json/json_serialization_writer.py
@@ -12,6 +12,7 @@

T = TypeVar("T")
U = TypeVar("U", bound=Parsable)
+PRIMITIVE_TYPES = [bool, str, int, float, UUID, datetime, timedelta, date, time, bytes, Enum]


class JsonSerializationWriter(SerializationWriter):
@@ -202,7 +203,7 @@ def write_collection_of_primitive_values(
if isinstance(values, list):
result = []
for val in values:
-temp_writer = self._create_new_writer()
+temp_writer: JsonSerializationWriter = self._create_new_writer()
temp_writer.write_any_value(None, val)
result.append(temp_writer.value)

@@ -252,6 +253,28 @@ def write_collection_of_enum_values(
else:
self.value = result

+def __write_collection_of_dict_values(
+self, key: Optional[str], values: Optional[List[Dict[str, Any]]]
+) -> None:
+"""Writes the specified collection of dictionary values to the stream with an optional
+given key.
+Args:
+key (Optional[str]): The key to be used for the written value. May be null.
+values (Optional[List[Dict[str, Any]]]): The collection of dictionary values
+to be written.
+"""
+if isinstance(values, list):
+result = []
+for val in values:
+temp_writer: JsonSerializationWriter = self._create_new_writer()
+temp_writer.__write_dict_value(None, val)
+result.append(temp_writer.value)
+
+if key:
+self.writer[key] = result
+else:
+self.value = result

def write_bytes_value(self, key: Optional[str], value: bytes) -> None:
"""Writes the specified byte array as a base64 string to the stream with an optional
given key.
@@ -319,14 +342,29 @@ def write_null_value(self, key: Optional[str]) -> None:
else:
self.value = "null"

+def __write_dict_value(self, key: Optional[str], value: Dict[str, Any]) -> None:
+"""Writes the specified dictionary value to the stream with an optional given key.
+Args:
+key (Optional[str]): The key to be used for the written value. May be null.
+value (Dict[str, Any]): The dictionary value to be written.
+"""
+if isinstance(value, dict):
+temp_writer: JsonSerializationWriter = self._create_new_writer()
+for dict_key, dict_value in value.items():
+temp_writer.write_any_value(dict_key, dict_value)
+if key:
+self.writer[key] = temp_writer.writer
+else:
+self.value = temp_writer.writer
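What the new `__write_dict_value` buys can be shown with a minimal stand-in writer (a sketch, not the library class): nested dictionaries in additional data are now recursed into rather than rejected or dumped verbatim:

```python
import json
from typing import Any, Dict, Optional


class MiniWriter:
    """Minimal illustrative stand-in for JsonSerializationWriter."""

    def __init__(self) -> None:
        self.writer: Dict[str, Any] = {}

    def write_any_value(self, key: Optional[str], value: Any) -> None:
        if isinstance(value, dict):
            self._write_dict_value(key, value)
        else:
            self.writer[key] = value

    def _write_dict_value(self, key: Optional[str], value: Dict[str, Any]) -> None:
        # Serialize the nested dict with a fresh writer, then attach its
        # result, mirroring the temp_writer pattern in the commit.
        temp = MiniWriter()
        for dict_key, dict_value in value.items():
            temp.write_any_value(dict_key, dict_value)
        self.writer[key] = temp.writer


w = MiniWriter()
w.write_any_value("extra", {"inner": {"a": 1}, "b": 2})
print(json.dumps(w.writer))  # {"extra": {"inner": {"a": 1}, "b": 2}}
```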

def write_additional_data_value(self, value: Dict[str, Any]) -> None:
"""Writes the specified additional data to the stream.
Args:
value (Dict[str, Any]): The additional data to be written.
"""
if isinstance(value, dict):
for key, val in value.items():
-self.writer[key] = val
+self.write_any_value(key, val)

def get_serialized_content(self) -> bytes:
"""Gets the value of the serialized content.
@@ -424,34 +462,37 @@ def write_any_value(self, key: Optional[str], value: Any) -> Any:
key (Optional[str]): The key to be used for the written value. May be null.
value (Any): The value to be written.
"""
-primitive_types = [bool, str, int, float, UUID, datetime, timedelta, date, time, Enum]
-if value:
-value_type = type(value)
-if key:
-if value_type in primitive_types:
-method = getattr(self, f'write_{value_type.__name__.lower()}_value')
-method(key, value)
-elif isinstance(value, Parsable):
-self.write_object_value(key, value)
-elif hasattr(value, '__dict__'):
-self.write_non_parsable_object_value(key, value)
-else:
-raise TypeError(
-f"Encountered an unknown type during serialization {value_type} \
-with key {key}"
-)
+value_type = type(value)
+if value is None:
+self.write_null_value(key)
+elif value_type in PRIMITIVE_TYPES:
+method = getattr(self, f'write_{value_type.__name__.lower()}_value')
+method(key, value)
+elif isinstance(value, Parsable):
+self.write_object_value(key, value)
+elif isinstance(value, list):
+if all(isinstance(x, Parsable) for x in value):
+self.write_collection_of_object_values(key, value)
+elif all(isinstance(x, Enum) for x in value):
+self.write_collection_of_enum_values(key, value)
+elif all((type(x) in PRIMITIVE_TYPES) for x in value):
+self.write_collection_of_primitive_values(key, value)
+elif all(isinstance(x, dict) for x in value):
+self.__write_collection_of_dict_values(key, value)
else:
-if value_type in primitive_types:
-method = getattr(self, f'write_{value_type.__name__.lower()}_value')
-method(None, value)
-elif isinstance(value, Parsable):
-self.write_object_value(None, value)
-elif hasattr(value, '__dict__'):
-self.write_non_parsable_object_value(None, value)
-else:
-raise TypeError(
-f"Encountered an unknown type during serialization {value_type}"
-)
+raise TypeError(
+f"Encountered an unknown collection type during serialization \
+{value_type} with key {key}"
+)
+elif isinstance(value, dict):
+self.__write_dict_value(key, value)
+elif hasattr(value, '__dict__'):
+self.write_non_parsable_object_value(key, value)
+else:
+raise TypeError(
+f"Encountered an unknown type during serialization {value_type} \
+with key {key}"
+)
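The new list handling checks collection homogeneity before picking a writer method. A reduced sketch of just that dispatch logic (the function name, return labels, and trimmed primitive set are illustrative; the Parsable check is omitted to keep the sketch dependency-free):

```python
from enum import Enum
from typing import Any, List

PRIMITIVES = (bool, str, int, float)  # trimmed illustrative subset

def classify_list(values: List[Any]) -> str:
    # Mirrors the order of checks in the new write_any_value: enums,
    # then primitives, then dicts; a mixed or unknown collection raises.
    if all(isinstance(x, Enum) for x in values):
        return "enums"
    if all(type(x) in PRIMITIVES for x in values):
        return "primitives"
    if all(isinstance(x, dict) for x in values):
        return "dicts"
    raise TypeError("Encountered an unknown collection type")

print(classify_list([1, 2, 3]))       # primitives
print(classify_list([{"a": 1}, {}]))  # dicts
```

Note that `all()` over an empty list is vacuously true, so an empty collection matches the first branch; the real method has the same edge case.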

def _serialize_value(self, temp_writer: JsonSerializationWriter, value: U):
if on_before := self.on_before_object_serialization:
