The relationship between random variables is a crucial aspect of data analysis. Fundamentally, we are interested in testing whether two variables are independent and, if not, in characterizing how they relate to each other. A variety of dependence measures, independence tests, and regression models and methods have been designed to answer these questions. In the first part of this thesis, we focus on mutual information (MI), which has been widely applied to quantify dependence. We propose a jackknife approach to estimating MI and establish its theoretical underpinnings. The proposed method possesses several desirable theoretical merits and exhibits superior performance in simulation studies. In the second part, we turn to the reproducing kernel Hilbert space (RKHS) method, which is arguably the most popular approach for handling nonlinearity in data. We introduce a symmetric periodic Gaussian kernel and establish the asymptotic normality of the method under a generic regression setting. The theoretical results and the simulation study collectively shed light on the success of the Gaussian reproducing kernel.
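To give a rough sense of the jackknife idea in this context, the sketch below applies a generic leave-one-out jackknife bias correction to a simple histogram plug-in MI estimator. This is only an illustration of the general technique, not the estimator proposed in the thesis; the histogram-based plug-in estimate and the bin count are illustrative assumptions.

```python
import numpy as np

def plugin_mi(x, y, bins=6):
    """Plug-in MI estimate (in nats) from a 2-D histogram (illustrative)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                 # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)       # marginal of x
    py = pxy.sum(axis=0, keepdims=True)       # marginal of y
    mask = pxy > 0                            # avoid log(0)
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

def jackknife_mi(x, y, bins=6):
    """Leave-one-out jackknife bias correction of the plug-in MI estimate."""
    n = len(x)
    full = plugin_mi(x, y, bins)
    loo = np.array([plugin_mi(np.delete(x, i), np.delete(y, i), bins)
                    for i in range(n)])
    # Standard jackknife bias-corrected estimate
    return n * full - (n - 1) * loo.mean()
```

On strongly dependent data the corrected estimate is clearly larger than on independent data, which is the qualitative behavior any reasonable MI estimator should display.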
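For readers less familiar with the RKHS approach, the following minimal kernel ridge regression sketch shows how a Gaussian reproducing kernel handles nonlinearity in a regression setting. It uses the plain (non-periodic) Gaussian kernel with illustrative bandwidth and regularization parameters, not the symmetric periodic kernel introduced in the thesis.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=0.5):
    """Gaussian (RBF) kernel matrix between 1-D sample vectors a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-d**2 / (2 * sigma**2))

def kernel_ridge_fit(x, y, lam=1e-4, sigma=0.5):
    """Solve (K + lam*n*I) alpha = y for the representer coefficients."""
    n = len(x)
    K = gaussian_kernel(x, x, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def kernel_ridge_predict(x_train, alpha, x_new, sigma=0.5):
    """Predict via the representer theorem: f(x) = sum_i alpha_i k(x, x_i)."""
    return gaussian_kernel(x_new, x_train, sigma) @ alpha
```

Fitting a smooth nonlinear target such as a sine curve, the Gaussian-kernel estimator recovers the function closely, a small-scale counterpart of the asymptotic behavior studied in the second part.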