I am using a Jupyter notebook with Python 2.7 from Anaconda. I have an approximately 250,000-dimensional data set which I need to compress to n lower dimensions, and I am using scikit-learn's TSNE. When running the t-SNE on this data, the kernel dies with:
"The kernel appears to have died."
```python
import sklearn.manifold

tsne = sklearn.manifold.TSNE(n_components=n, verbose=2)
data_nd_tsne = tsne.fit_transform(data)  # data is the high-dimensional array
# calculate stuff from data_nd_tsne
```
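As a side note (this is not part of the original question), a common mitigation for memory blow-ups like this, and one the scikit-learn t-SNE documentation itself suggests for dense high-dimensional data, is to reduce to roughly 50 dimensions with PCA before running t-SNE. A minimal sketch on a small random stand-in for the real data (all sizes and variable names here are placeholders):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

# Small random stand-in for the real ~250,000-dimensional data
rng = np.random.RandomState(0)
data = rng.rand(100, 500)

# Reduce to ~50 dimensions with PCA first; this cuts t-SNE's memory
# footprint drastically and often avoids the kernel dying
data_50d = PCA(n_components=50).fit_transform(data)

data_nd_tsne = TSNE(n_components=2).fit_transform(data_50d)
print(data_nd_tsne.shape)
```

Whether this is enough depends on how many samples you have; it addresses the dimensionality, not the sample count.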
I have found a solution using python-bhtsne, which is also an implementation of the Barnes-Hut t-Distributed Stochastic Neighbor Embedding approach. It is very easy to use and even provides an option to get the same output in every run of tsne with the same parameters - something that is absent in the scikit-learn implementation. It is a Python wrapper for the original implementation by Laurens van der Maaten.
So basically you'll just need to do the following instead of the regular scikit-learn call:

```python
from bhtsne import tsne

# rand_seed fixes the random seed - the reproducibility option mentioned
# above (assuming the pip `bhtsne` package API)
data_nd_tsne = tsne(diff_df, rand_seed=42)
```