Regression doesn't seem to work :[

A place to discuss the science of computers and programs, from algorithms to computability.

Formal proofs preferred.

Moderators: phlip, Moderators General, Prelates


Postby jacques01 » Fri Jan 06, 2017 11:59 pm UTC

I have a dataset of about 12,000 instances (X). Each instance has 6,000 real-valued features and is associated with a real value that ranges from 0.0 to about 7.0 (Y). I am trying to predict the Y value for new instances of X for which I'll have the features but not the associated Y value.

I have tried several regression models in scikit-learn: linear regression, Lasso regression, Ridge regression, and a multilayer perceptron regressor.
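For concreteness, here's a minimal sketch of fitting those four model types, using small synthetic data as a stand-in for the real 12,000 x 6,000 matrix (the dimensions are shrunk so it runs quickly; alphas and layer sizes are placeholder guesses, not tuned values):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Lasso, Ridge
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Toy stand-in for the real feature matrix: 500 instances, 100 features.
X, y = make_regression(n_samples=500, n_features=100, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "linear": LinearRegression(),
    "lasso": Lasso(alpha=1.0),
    "ridge": Ridge(alpha=1.0),
    "mlp": MLPRegressor(hidden_layer_sizes=(50,), max_iter=500, random_state=0),
}

# Fit each model and score it on the held-out split (score() returns R^2).
scores = {name: m.fit(X_train, y_train).score(X_test, y_test)
          for name, m in models.items()}
for name, r2 in scores.items():
    print(f"{name}: R^2 = {r2:.3f}")
```

With 6,000 features and only 12,000 instances, the regularized models (Lasso/Ridge) would usually need their alpha tuned rather than left at the default.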

Doing repeated 10-fold cross-validation, I only get a mediocre R² of no more than around 0.60 (1.0 being perfect, and 0.0 being no better than predicting the mean). My mean squared error is mediocre too, hovering around 0.53. I would prefer a mean squared error closer to 0.20 or lower.
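That evaluation setup can be reproduced with scikit-learn's RepeatedKFold and cross_validate; a small sketch on synthetic data (Ridge is just an example estimator here, and the repeat count is arbitrary):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import RepeatedKFold, cross_validate

X, y = make_regression(n_samples=300, n_features=50, noise=5.0, random_state=1)

# Repeated 10-fold CV: 10 splits, repeated 3 times with different shuffles.
cv = RepeatedKFold(n_splits=10, n_repeats=3, random_state=1)
res = cross_validate(Ridge(alpha=1.0), X, y, cv=cv,
                     scoring=("r2", "neg_mean_squared_error"))

# cross_validate returns negated MSE (higher = better), so flip the sign.
print("mean R^2:", res["test_r2"].mean())
print("mean MSE:", -res["test_neg_mean_squared_error"].mean())
```

Reporting the standard deviation across the 30 fold scores alongside the means would also show how stable that 0.60 figure actually is.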

What can I do to get better results? Could it be that my features are not good enough to capture the relation?

A professor I once knew told me that the hardest problems are the ones that don't have a solution, i.e. trying to prove something which is actually false. I wonder if the features I'm using don't actually relate to the output, e.g. it's like trying to predict someone's income using the exact seconds when they had all their adult teeth in place as a child. Or predicting a home's value based on the population of squirrels in the neighborhood. I suspect neither of those will get you very far in the respective problems. Could this be my problem, and if so, how do I prove it?
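One hedged way to put that "maybe the features just don't relate to Y" hypothesis to the test is a permutation test: shuffle Y, refit, and see how often the shuffled fit scores as well as the real one. scikit-learn ships this as permutation_test_score; a sketch on synthetic data (which does contain a real signal, so the p-value comes out small — with squirrel-population features it would come out large):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import permutation_test_score

X, y = make_regression(n_samples=300, n_features=50, noise=5.0, random_state=2)

# score: cross-validated R^2 on the real labels.
# perm_scores: R^2 for each refit on a shuffled copy of y.
# pvalue: fraction of shuffled fits that matched or beat the real score.
score, perm_scores, pvalue = permutation_test_score(
    Ridge(alpha=1.0), X, y, cv=5, n_permutations=30,
    scoring="r2", random_state=2)
print(f"real R^2 = {score:.3f}, permutation p-value = {pvalue:.3f}")
```

If your real data gave a p-value near 0.5 the features would be indistinguishable from noise, but an R² of 0.60 already suggests there is some signal — just possibly not enough for an MSE of 0.20.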
