class documentation

Undocumented

Class Method multi_source_features_for Undocumented
Method all_between_dates Undocumented
Method all_data Undocumented
Async Method delete Undocumented
Method df_to_deltalake_compatible Undocumented
Method features_for Undocumented
Async Method freshness Returns the freshness of the data source for a given feature
Async Method insert Undocumented
Method job_group_key A key defining which sources can be grouped together in one request.
Async Method overwrite Undocumented
Async Method read_pandas Undocumented
Async Method schema Returns the schema for the data source
Async Method to_lazy_polars Undocumented
Async Method upsert Undocumented
Async Method write_pandas Undocumented
Async Method write_polars Undocumented
Class Variable config Undocumented
Class Variable date_formatter Undocumented
Class Variable mapping_keys Undocumented
Class Variable path Undocumented
Class Variable type_name Undocumented
Property storage Undocumented
Property to_markdown Undocumented

Inherited from CodableBatchDataSource:

Class Method _deserialize Undocumented
Method _serialize Undocumented

Inherited from BatchDataSource (via CodableBatchDataSource):

Method __hash__ Undocumented
Method all Undocumented
Method all_columns Undocumented
Method depends_on Undocumented
Async Method feature_view_code Set up the code needed to represent the data source as a feature view
Method filter Undocumented
Method location_id Undocumented
Method source_id An id that distinguishes one source from others.
Method tags Undocumented
Method transform_with_polars Undocumented
Method with_loaded_at Undocumented
Method with_view Undocumented

Inherited from ColumnFeatureMappable (via CodableBatchDataSource, BatchDataSource):

Method columns_for Undocumented
Method feature_identifier_for Undocumented
Method with_renames Undocumented
def all_between_dates(self, request: RetrivalRequest, start_date: datetime, end_date: datetime) -> RetrivalJob: (source)
def all_data(self, request: RetrivalRequest, limit: int | None) -> RetrivalJob: (source)
async def delete(self): (source)
def df_to_deltalake_compatible(self, df: pl.DataFrame, request: RetrivalRequest) -> tuple[pl.DataFrame, dict]: (source)

Undocumented

def features_for(self, facts: RetrivalJob, request: RetrivalRequest) -> RetrivalJob: (source)
async def freshness(self, feature: Feature) -> datetime | None: (source)

my_table_freshness = await (
    PostgreSQLConfig("DB_URL")
    .table("my_table")
    .freshness()
)

async def insert(self, job: RetrivalJob, request: RetrivalRequest): (source)
def job_group_key(self) -> str: (source)

A key defining which sources can be grouped together in one request.
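Since this key only matters when several sources are batched into one request, a minimal sketch of how a caller might use it — assuming nothing beyond the `job_group_key()` method documented above; `FakeSource` is a hypothetical stand-in, not part of the library:

```python
from collections import defaultdict

def group_sources(sources) -> dict[str, list]:
    """Bucket sources by job_group_key so each bucket can be fetched in one request."""
    groups: dict[str, list] = defaultdict(list)
    for source in sources:
        groups[source.job_group_key()].append(source)
    return dict(groups)

class FakeSource:
    """Hypothetical stand-in exposing only job_group_key()."""
    def __init__(self, key: str) -> None:
        self.key = key

    def job_group_key(self) -> str:
        return self.key
```

Two sources returning the same key would land in the same bucket and could be resolved in a single request.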

async def overwrite(self, job: RetrivalJob, request: RetrivalRequest): (source)
async def read_pandas(self) -> pd.DataFrame: (source)

Undocumented

async def schema(self) -> dict[str, FeatureType]: (source)

Returns the schema for the data source

```python
source = FileSource.parquet_at('test_data/titanic.parquet')
schema = await source.schema()
>>> {'passenger_id': FeatureType(name='int64'), ...}
```

Returns:
dict[str, FeatureType]: A dictionary mapping each column name to its feature type
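Because `schema()` is async, callers must await it. A small sketch of consuming the returned mapping — `FakeSource` below is hypothetical and its schema values are plain strings standing in for `FeatureType` instances:

```python
import asyncio

async def column_names(source) -> list[str]:
    """Return a source's column names, sorted, via the async schema() method."""
    schema = await source.schema()
    return sorted(schema)

class FakeSource:
    """Hypothetical source; schema() mirrors the dict[str, FeatureType] shape."""
    async def schema(self) -> dict[str, str]:
        return {"passenger_id": "int64", "age": "float"}

print(asyncio.run(column_names(FakeSource())))  # → ['age', 'passenger_id']
```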
async def to_lazy_polars(self) -> pl.LazyFrame: (source)

Undocumented

async def upsert(self, job: RetrivalJob, request: RetrivalRequest): (source)
async def write_pandas(self, df: pd.DataFrame): (source)

Undocumented

async def write_polars(self, df: pl.LazyFrame): (source)

Undocumented

date_formatter: DateFormatter = (source)

Undocumented

@property
to_markdown: str = (source)

Undocumented