
embedding dimension change #84

Open
@xiexbing

Description


Hello, I finished the first implementation of tzrec-based recommendation training. Thanks for the help along the way. One thing I don't understand: I use an embedding group to manage the tables, and I build the features as IdFeature and LookupFeature, but no matter how I set embedding_dim in the config, the table dimension is 4 when I check the sharding plan. Is this expected? Please see my embedding config below.

feature_configs = []
# Build a LookupFeature config for every feature in each embedding bag
for emb_bag_name, emb_bag_config in embedding_bag_configs.items():
    for feature_name in emb_bag_config.feature_names:
        feature_configs.append(
            feature_pb2.FeatureConfig(
                lookup_feature=feature_pb2.LookupFeature(
                    feature_name=feature_name,
                    embedding_name=emb_bag_config.name,
                    embedding_dim=128,
                    pooling='mean',
                    num_buckets=emb_bag_config.num_embeddings,
                )
            )
        )

# Create features
features = create_features(feature_configs)
# Define feature groups
feature_groups = [
    model_pb2.FeatureGroupConfig(
        group_name="wide",
        feature_names=embedding_features,
        group_type=model_pb2.FeatureGroupType.WIDE,
    ),
]

# Initialize EmbeddingGroup
embedding_group = EmbeddingGroup(features, feature_groups, device=device)
return embedding_group, embedding_features
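One pattern worth ruling out when a configured dimension never shows up downstream: the dimension field the library actually reads may differ from the one being set, in which case the library's default (4 here, matching the symptom) silently wins. The toy sketch below illustrates that failure mode with a hypothetical `FakeLookupFeature` stand-in; it is not tzrec's actual config logic, and the field names and the default of 4 are assumptions for illustration only.

```python
from dataclasses import dataclass, fields

# Hypothetical stand-in for a feature config message. The library-side
# default of embedding_dim=4 is an assumption chosen to match the symptom.
@dataclass
class FakeLookupFeature:
    feature_name: str
    embedding_dim: int = 4  # default used whenever the field is left unset

def resolve_dim(cfg_kwargs):
    """Return the dimension the (fake) library would end up using.

    Keys the config schema does not know about (e.g. a typo such as
    'embedding_dims') are dropped, so the default dimension wins.
    """
    known_names = {f.name for f in fields(FakeLookupFeature)}
    known = {k: v for k, v in cfg_kwargs.items() if k in known_names}
    return FakeLookupFeature(**known).embedding_dim
```

With the correctly spelled key, `resolve_dim({"feature_name": "f", "embedding_dim": 128})` yields 128; with a mis-spelled key it falls back to 4. Printing the parsed `feature_configs` (or the per-table dims in the sharding plan) before training is a quick way to check which value actually survived.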
