I found this error in the "Making predictions" step:

ValueError: Found array with 0 feature(s) (shape=(1, 0)) while a minimum of 1 is required.

Please suggest a solution.

Please share screenshots of the code/error.

I had the same issue. I used four input columns and got an error at this step.

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-105-eb2f8253e8b4> in <module>
----> 1 predicted_price = predict_input(sample_input)

<ipython-input-92-09cdc347837d> in predict_input(single_input)
      3     input_df[numeric_cols] = imputer.transform(input_df[numeric_cols])
      4     input_df[numeric_cols] = scaler.transform(input_df[numeric_cols])
----> 5     input_df[encoded_cols] = encoder.transform(input_df[categorical_cols].values)
      6     X_input = input_df[numeric_cols + encoded_cols]
      7     return model.predict(X_input)[0]

/opt/conda/lib/python3.9/site-packages/sklearn/preprocessing/_encoders.py in transform(self, X)
    469         check_is_fitted(self)
    470         # validation of X happens in _check_X called by _transform
--> 471         X_int, X_mask = self._transform(X, handle_unknown=self.handle_unknown,
    472                                         force_all_finite='allow-nan')
    473 

/opt/conda/lib/python3.9/site-packages/sklearn/preprocessing/_encoders.py in _transform(self, X, handle_unknown, force_all_finite)
    111 
    112     def _transform(self, X, handle_unknown='error', force_all_finite=True):
--> 113         X_list, n_samples, n_features = self._check_X(
    114             X, force_all_finite=force_all_finite)
    115 

/opt/conda/lib/python3.9/site-packages/sklearn/preprocessing/_encoders.py in _check_X(self, X, force_all_finite)
     42         if not (hasattr(X, 'iloc') and getattr(X, 'ndim', 0) == 2):
     43             # if not a dataframe, do normal check_array validation
---> 44             X_temp = check_array(X, dtype=None,
     45                                  force_all_finite=force_all_finite)
     46             if (not hasattr(X, 'dtype')

/opt/conda/lib/python3.9/site-packages/sklearn/utils/validation.py in inner_f(*args, **kwargs)
     61             extra_args = len(args) - len(all_args)
     62             if extra_args <= 0:
---> 63                 return f(*args, **kwargs)
     64 
     65             # extra_args > 0

/opt/conda/lib/python3.9/site-packages/sklearn/utils/validation.py in check_array(array, accept_sparse, accept_large_sparse, dtype, order, copy, force_all_finite, ensure_2d, allow_nd, ensure_min_samples, ensure_min_features, estimator)
    732         n_features = array.shape[1]
    733         if n_features < ensure_min_features:
--> 734             raise ValueError("Found array with %d feature(s) (shape=%s) while"
    735                              " a minimum of %d is required%s."
    736                              % (n_features, array.shape, ensure_min_features,

ValueError: Found array with 0 feature(s) (shape=(1, 0)) while a minimum of 1 is required.

Please check whether you are passing a correct input_df.
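One quick way to check input_df is to print the shapes feeding each transform step. The column names below are hypothetical, just mirroring the names in the traceback; an empty categorical_cols list is what produces the (1, 0) shape in the error:

```python
import pandas as pd

# Hypothetical sample input and column lists, named after the thread's code.
sample_input = {"area": 1000.0, "rooms": 3}
numeric_cols = ["area", "rooms"]
categorical_cols = []  # empty list -> encoder.transform receives 0 features

input_df = pd.DataFrame([sample_input])
print(input_df[numeric_cols].shape)      # (1, 2)
print(input_df[categorical_cols].shape)  # (1, 0): the shape in the ValueError
```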

Sorry, I didn't quite understand what you are saying. Are we supposed to change the values in sample_input before running the code? All the code above it runs perfectly; the problem starts at this part.

Edit: Found the issue. I had to comment out input_df[encoded_cols] = encoder.transform(input_df[categorical_cols].values), since I didn't use any categorical input columns, so that line raises the error.
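For reference, the no-categorical-columns case can be reproduced directly. Per the traceback, the encoder validates its input via check_array, which rejects a (1, 0) array; the column names below are hypothetical:

```python
import pandas as pd
from sklearn.utils import check_array

# With an empty list of categorical columns, a one-row DataFrame slice
# becomes an array of shape (1, 0), which sklearn's validation rejects.
input_df = pd.DataFrame({"area": [1000.0], "rooms": [3]})
categorical_cols = []  # no categorical input columns were used

arr = input_df[categorical_cols].values
print(arr.shape)  # (1, 0)

err = None
try:
    check_array(arr)  # the same validation the encoder runs internally
except ValueError as e:
    err = e
print(err)  # Found array with 0 feature(s) (shape=(1, 0)) ...
```

Guarding the encoder step with a simple "if categorical_cols:" check would avoid the error without deleting the line by hand.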


I think you are getting the error because of the .values at the end of input_df[encoded_cols] = encoder.transform(input_df[categorical_cols].values). Just remove the .values and keep the rest as is, and you won't get the error. Also, you shouldn't comment out that line, because you are using categorical columns in the sample input.


Big thanks, it works! :grinning: