Week 6: Implementation of RFF

This week I implemented the RFF method and compared its results with the RKHS and ProbSpace methods. Here is a day-wise summary of my progress through the week:

Monday:

Discussed some possible methods of conditioning on multiple variables with Roger sir during our weekly meet. It is surprisingly hard to find papers on the topic; after many failed attempts to find one that could help with what we are trying to achieve, it seems we will have to implement this feature ourselves. We can take inspiration from how ProbSpace does it and extend it to kernel methods if possible, or else try a hybrid of the two.

Roger sir was also looking into using RKHS to rebuild a probability distribution curve from the filtered data points (the ProbSpace method of computing conditional probabilities is essentially filtering the data repeatedly on each conditional variable's value). This could help us solve the dwindling-data-points problem that occurs with repeated filtering. I studied some conditional probability theory to see if there was an obvious method we were missing, but conditional probability calculations are tightly tied to joint probability distributions, so no luck there.
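To make the dwindling-points issue concrete, here is a toy sketch of conditioning by repeated filtering in the ProbSpace style (the data, the tolerance, and the `filter_near` helper are all hypothetical, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
data = {"x": rng.normal(size=n), "y": rng.normal(size=n), "z": rng.normal(size=n)}

def filter_near(data, var, value, tol=0.1):
    """Keep only the samples where `var` lies within tol of value (hypothetical helper)."""
    mask = np.abs(data[var] - value) < tol
    return {k: v[mask] for k, v in data.items()}

# Conditioning on two variables by filtering twice in succession:
once = filter_near(data, "x", 0.0)
twice = filter_near(once, "y", 0.0)
print(len(data["z"]), len(once["z"]), len(twice["z"]))
```

Each additional conditioning variable keeps only a small fraction of the remaining samples, which is exactly the problem described above.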


Tuesday:

Upon Roger sir’s suggestion, I have shifted the objectives of my project, as the quest to implement multiple conditioning is at a dead end for now. Going forward I will be focusing on 3 separate things: modifying ProbSpace methods, integrating RKHS into ProbSpace methods, and implementing Random Fourier Features, and then testing the accuracy/results of all of these. Started working on the implementation of RFFs today based on this blog.
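For reference, here is a minimal sketch of the standard RFF construction (following Rahimi and Recht's random features for the RBF kernel; the function name and parameter defaults are my own, not taken from the blog):

```python
import numpy as np

def rff_features(X, D=100, sigma=1.0, seed=0):
    """Map X of shape (n, d) to D random Fourier features whose inner
    products approximate the RBF kernel exp(-||x - y||^2 / (2 sigma^2))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(0.0, 1.0 / sigma, size=(d, D))  # frequencies drawn from the kernel's spectral density
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)      # random phase offsets
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

X = np.random.default_rng(1).normal(size=(5, 3))
Z = rff_features(X, D=2000, sigma=1.0)
K_approx = Z @ Z.T  # approximates the 5x5 RBF kernel matrix
```

The point is that `Z @ Z.T` stands in for the exact kernel matrix, so any kernel-based computation can be run on the explicit `(n, D)` features instead.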

Wednesday:

Implemented and tested the RFF method. Average error comparison between the three methods:

ProbSpace:  10.758674242461845

RKHS:        0.8640715047275058

RFF:         4.46337896

Here is a comparison graph between the RKHS, RFF and ProbSpace methods:

Thursday & Friday:

I compared the execution times of the three methods. The RFF method's results also depend on the number of features selected and the ‘sigma’ parameter, so I ran some tests to find the ideal sigma and feature-size values.

The problem with feature sizes is that, due to the random nature of the feature vector selection, a feature size of 10 sometimes produces more accurate results than a size of 1000, even though theoretically a higher feature size should give better results. After fixing random.seed() to generate pseudo-random features, here are a couple of feature size vs. accuracy comparison curves for different seed values:
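One way to check this behaviour directly, without going through the full pipeline, is to measure the kernel-approximation error itself across several seeds (a self-contained sketch; the data and helper below are illustrative only):

```python
import numpy as np

def rff_kernel_error(X, D, sigma=1.0, seed=0):
    """Max absolute error between the exact RBF kernel matrix and its
    RFF approximation, for one random draw of D features."""
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, 1.0 / sigma, size=(X.shape[1], D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K_true = np.exp(-sq / (2.0 * sigma ** 2))
    return np.max(np.abs(Z @ Z.T - K_true))

X = np.random.default_rng(42).normal(size=(20, 2))
mean_err = {D: np.mean([rff_kernel_error(X, D, seed=s) for s in range(10)])
            for D in (10, 100, 1000)}
print(mean_err)  # averaged over seeds, the error shrinks roughly like 1/sqrt(D)
```

Any single seed can buck the trend, which matches the inconclusive single-run curves, but averaged over seeds the error decays with D as the theory predicts.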


As seen, the results are inconclusive; however, I’ve found that the accuracy is most consistent when the feature size ranges from N/10 to N/100. Similarly, here is a sigma value vs. accuracy comparison curve with a constant feature size:


A sigma value of 0.2 produces the best results. Hence, here are the revised RKHS vs RFF vs ProbSpace graphs with the best parameters for RFF, for 2 different random.seed() values:
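As a side note, for picking a starting sigma without a full sweep, the median heuristic is a standard rule of thumb for RBF-type bandwidths (general practice, not something from these experiments): set sigma to the median pairwise distance of the data.

```python
import numpy as np

def median_heuristic_sigma(X):
    """Median of the pairwise Euclidean distances between rows of X --
    a common default bandwidth for RBF-type kernels."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    d = np.sqrt(sq[np.triu_indices_from(sq, k=1)])
    return float(np.median(d))

X = np.random.default_rng(0).normal(size=(100, 2))
sigma0 = median_heuristic_sigma(X)
```

A sweep around such a starting value then refines it, the way the sweep above landed on 0.2.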



And here are the error and execution-time comparisons:

ProbSpace           time taken: 8.857105493545532    average error: 21.206282868505696
RKHS-E(X)           time taken: 2.690704107284546    average error:  1.5750775765874951
RFF-100             time taken: 4.507542610168457    average error:  5.297663980507145
RFF (N/10 = 1000)   time taken: 5.321192026138306    average error:  3.308730571215344


As initially expected from the theory, the RFF method produces accuracies comparable to the RKHS method. However, it should also take less time to compute, since it works in a lower-dimensional feature space rather than on the full kernel matrix. I will need to figure out how it can be further optimized, as the final goal is to have a choice of two techniques, RKHS and pseudo-RKHS (RFF), to be used depending on the accuracy and execution-time requirements.
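On the optimization point: the main lever is that RFF allows solving in the D-dimensional feature space, turning an N x N kernel solve into a D x D one. A minimal sketch with ridge regression on toy data (the data, lambda, and sizes here are all made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N, d, D, sigma, lam = 2000, 1, 200, 0.2, 1e-3

# Toy 1-D regression problem
X = rng.uniform(-3.0, 3.0, size=(N, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=N)

# RFF feature map for the RBF kernel with bandwidth sigma
W = rng.normal(0.0, 1.0 / sigma, size=(d, D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)
Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Primal ridge solve: a D x D system, roughly O(N D^2 + D^3),
# instead of the N x N system exact kernel ridge regression needs.
w = np.linalg.solve(Z.T @ Z + lam * np.eye(D), Z.T @ y)
mse = np.mean((Z @ w - y) ** 2)
```

When D is much smaller than N, this is where the time savings over RKHS should come from.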

Upcoming Week Plans:

Understand the inner workings of the RFF method to improve calculation times, and start working on the other objectives mentioned above.
