it's nice somewhere else too

@vielmetti tough one /via @vacuum

@vielmetti reading about

  • feature extraction
  • bit vectors
  • association rule mining

and thinking about how these all have graph representations for predicting “interestingness” (or “non-interestingness”).

@vielmetti not as tough as it seems. you have a thing. it has a bunch of attributes (features) that you observe or derive, one bit per each. it’s drawn from a large universe of things which have lots of features, so this particular thing only has a few bits set in its bit vector.
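The setup in that tweet can be sketched in a few lines. This is an illustrative sketch, not anything from the thread: the universe size and feature ids below are made up, and `bit_vector` is a hypothetical helper that packs a thing's few observed features into one integer.

```python
FEATURE_UNIVERSE_SIZE = 10_000  # assumed: large universe, few bits set per thing

def bit_vector(feature_ids):
    """Pack a small set of observed/derived feature ids into one integer,
    one bit per feature."""
    v = 0
    for f in feature_ids:
        v |= 1 << f
    return v

# a "thing" with only three of the many possible features observed
thing = bit_vector({3, 17, 4096})
assert thing & (1 << 17)        # feature 17 is set
assert not thing & (1 << 5)     # feature 5 is not
```

Python's arbitrary-precision integers make this cheap even for a large sparse universe, since only the set bits cost anything.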

@vielmetti one particular bit is the “interesting” bit, which is expensive to derive. you want to guess based on previous observations of other things (with each of their feature lists) whether this “interesting” bit should be set on a new sample.
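One crude way to make that guess from previous observations is to count, per feature, how often it co-occurred with the expensive "interesting" bit, then score a new sample by its features' historical rates. This is a hedged sketch of that idea only; the function names and the toy history are invented, and a real system would want smoothing and better statistics.

```python
from collections import Counter

def train(samples):
    """samples: iterable of (feature_set, interesting: bool) pairs
    from things whose interesting bit was already (expensively) derived."""
    with_bit, total = Counter(), Counter()
    for features, interesting in samples:
        for f in features:
            total[f] += 1
            if interesting:
                with_bit[f] += 1
    # per-feature rate of co-occurrence with the interesting bit
    return {f: with_bit[f] / total[f] for f in total}

def score(rates, features):
    """Average the known per-feature rates; crude, but cheap to compute."""
    known = [rates[f] for f in features if f in rates]
    return sum(known) / len(known) if known else 0.0

history = [({1, 2}, True), ({2, 3}, False), ({1, 4}, True)]
rates = train(history)
assert score(rates, {1}) == 1.0  # feature 1 always came with the bit set
assert score(rates, {3}) == 0.0  # feature 3 never did
```

A threshold on `score` then decides whether the new sample is worth the expensive derivation.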

@vielmetti You’d like it if there were one feature you could extract that was a perfect predictor of interestingness, because then you could replace your machine learning problem with a very tiny shell script and a regular expression. That’s probably not going to happen.
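If that lucky case ever did happen, the "tiny script plus regular expression" would look roughly like this. Everything here is hypothetical: the marker string and the sample records are invented purely to show how small the "classifier" would be.

```python
import re

# Hypothetical: one perfectly predictive, cheaply extractable feature,
# detectable by a single regular expression over raw records.
PERFECT_PREDICTOR = re.compile(r"feature_4096")  # made-up marker

records = ["a feature_4096 thing", "a plain thing"]
interesting = [r for r in records if PERFECT_PREDICTOR.search(r)]
assert interesting == ["a feature_4096 thing"]
```

That collapse from learning problem to one-liner is exactly why a perfect predictor is too much to hope for.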