This repository has been archived by the owner on May 7, 2021. It is now read-only.
Hello, yes. Unfortunately, no smoothing technique is applied, and PROD(P(feat_i | cat)) becomes pretty big with lots of features and categories. You can, however, provide your own IFeatureProbability<T, K> calculator. This requires you to provide your own Classifier<T, K>, though (or to override featuresProbabilityProduct(Collection<T> features, K category) in BayesClassifier<T, K>).
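For later readers: the usual remedies are add-one (Laplace) smoothing, so no feature probability is ever zero, and accumulating log-probabilities instead of a raw product, so the score neither overflows nor underflows. A minimal sketch of both ideas, independent of the library (every name here is hypothetical, not part of the IFeatureProbability<T, K> API):

```java
import java.util.List;

// Hypothetical sketch: Laplace smoothing plus log-space accumulation,
// the standard fixes when a plain product of per-feature probabilities
// over- or underflows. None of these names come from the library.
public class SmoothedProbability {

    // P(feature | category) with add-one (Laplace) smoothing:
    // (count + 1) / (total + vocabularySize) is never zero.
    static double laplace(int countInCategory, int totalInCategory, int vocabularySize) {
        return (countInCategory + 1.0) / (totalInCategory + vocabularySize);
    }

    // Sum log-probabilities instead of multiplying raw probabilities;
    // the sum stays in a safe numeric range even for thousands of
    // features, and categories can be compared by this value directly.
    static double logProduct(List<Double> featureProbabilities) {
        double logSum = 0.0;
        for (double p : featureProbabilities) {
            logSum += Math.log(p);
        }
        return logSum;
    }
}
```

A custom IFeatureProbability<T, K> implementation could delegate to something like `laplace` above, while the log-space trick would belong in an overridden featuresProbabilityProduct.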
Hello,
I know you haven't worked on this in a while, but I was wondering if you have any idea why I keep seeing this issue. I have added about 25 categories to the model, with lots of data in each. No matter what chunk of text I feed in, when I classify it the majority of the categories return a probability of Infinity.
ex.

Classification[
  category=friends_gatherings,
  probability=Infinity,
  featureset=[
    after,
    school,
    soccerabout,
    this,
    ...
  ]
]
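A probability of Infinity like the one above is typically a floating-point artifact: multiplying many small per-feature probabilities in plain float arithmetic underflows to exactly zero, and any later division by that product yields Infinity. A minimal standalone illustration (not the library's code; the class name is hypothetical):

```java
// Hypothetical demo of how a long product of probabilities produces
// Infinity downstream: the product underflows to 0.0f, and dividing
// by it gives Float.POSITIVE_INFINITY.
public class InfinityDemo {

    // Multiply n copies of a small per-feature probability p in float
    // arithmetic. With enough factors the product underflows to 0.0f.
    static float product(int n, float p) {
        float prod = 1.0f;
        for (int i = 0; i < n; i++) {
            prod *= p;
        }
        return prod;
    }

    public static void main(String[] args) {
        float p = InfinityDemo.product(200, 0.01f);
        System.out.println(p);        // underflows to 0.0
        System.out.println(1.0f / p); // division by it is Infinity
    }
}
```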