Suppose your machine has 4GB of RAM and you want to train a model on a 10GB dataset. How would you go about this problem?
First of all, ask which kind of ML model you want to train, because the answer depends on it.
For neural networks: small batches read from a memory-mapped NumPy array will work.
- Open the dataset as a memory-mapped NumPy array (np.memmap). A memmap creates a mapping of the complete dataset on disk; it does not load the whole dataset into memory.
- Index the memmapped array to pull out only the rows you need; just the pages you touch are read from disk.
- Feed each slice to the neural network as a training batch.
- Keep the batch size small so each batch fits comfortably in RAM.
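The steps above can be sketched as follows. This is a minimal illustration, not a full training loop: the file name, shapes, and sizes are made up for the demo, and a tiny array stands in for the real 10GB file.

```python
import numpy as np

# Demo setup: write a small array to disk once, standing in for the
# real 10GB dataset (e.g. saved earlier via big_array.tofile(...)).
n_samples, n_features = 1_000, 20
rng = np.random.default_rng(0)
demo = rng.standard_normal((n_samples, n_features)).astype(np.float32)
demo.tofile("train_data.dat")

# np.memmap maps the file into virtual memory: only the pages you
# actually index get read from disk, so RAM usage stays small.
data = np.memmap("train_data.dat", dtype=np.float32,
                 mode="r", shape=(n_samples, n_features))

batch_size = 64
for start in range(0, n_samples, batch_size):
    # Slicing pulls just this batch into RAM; np.asarray makes a copy.
    batch = np.asarray(data[start:start + batch_size])
    # model.train_on_batch(batch, ...)  # feed to your network here
```

The same pattern works with any framework: each batch is an ordinary in-memory array by the time it reaches the model.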
For SVMs: incremental fitting with partial_fit will work.
- Split the one big dataset into small chunks that each fit in memory.
- Call the partial_fit method of an incremental learner on one chunk; it only needs that subset of the complete dataset at a time. (Note that scikit-learn's SVC has no partial_fit; SGDClassifier with hinge loss trains a linear SVM incrementally.)
- Repeat the previous step for the remaining chunks.
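A sketch of this chunk-by-chunk loop, assuming scikit-learn is available. The chunk generator here fabricates random data purely so the example is self-contained; in practice it would read successive slices of the real dataset from disk.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)

def chunks(n_chunks=10, chunk_size=200, n_features=20):
    """Stand-in for reading the big dataset from disk one chunk at a time."""
    for _ in range(n_chunks):
        X = rng.standard_normal((chunk_size, n_features))
        # Toy labels: sign of the first feature plus a little noise.
        y = (X[:, 0] + 0.1 * rng.standard_normal(chunk_size) > 0).astype(int)
        yield X, y

# hinge loss gives the linear-SVM objective, trained incrementally.
clf = SGDClassifier(loss="hinge")
for X_chunk, y_chunk in chunks():
    # `classes` must be passed on the first call so the model
    # knows the full label set before it has seen every chunk.
    clf.partial_fit(X_chunk, y_chunk, classes=np.array([0, 1]))
```

Only one chunk is ever in memory, so peak RAM is set by the chunk size rather than the full dataset size.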
For more ML interview questions, read the article below: