Notes for contributing
Contributions are always welcome, as are feature requests and suggestions. Feel free to open issues and pull requests at any time. If you aren't familiar with git or GitHub, now is a good time to learn the basics.
Setting up the workspace
Fork the ModMashup.jl repository first.
Julia's built-in package manager makes it easy to install and modify packages from GitHub. Any package hosted on GitHub can be installed via Pkg.clone by providing the repository's URL, so installing a fork on your system is simple.
Remember to replace https://github.com/memoiry/ModMashup.jl with the URL of your fork of ModMashup.jl:
$ julia
Pkg.clone("https://github.com/memoiry/ModMashup.jl")
Run the tests:
Pkg.test("ModMashup")
Everything should now be working. You can find the package's location on your computer with:
$ julia -e 'println(Pkg.dir("ModMashup"))'
Then simply make your changes in that folder.
Network integration
For those who want to continue developing the mashup network integration algorithm, the only function you need to modify is network_integration!.
Modified mashup algorithm for network integration
The implementation of the modified mashup algorithm for network integration is still under development, and its effectiveness has not been fully verified.
The current mashup integration algorithm works as follows:
1. Random walk with restarts: for each similarity network Ai, run a random walk with restarts to obtain Qi.
2. Smoothing: for each Qi, choose whether or not to smooth. Smoothing means Ri = log(abs(Qi) + 1/n_patients).
3. Concatenate each Qi (not smoothed) or Ri (smoothed) along the row axis to get one large matrix N.
4. Run an SVD: U, S, V = svd(N).
5. Let H = U * S^(1/2) and V = S^(1/2) * V.
6. Linear regression: Beta = V' \ annotation (left division; annotation is a binary vector where 1 indicates the query type and -1 indicates the not-query type).
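The steps above can be sketched in Julia roughly as follows. This is a minimal illustration on random data, not the package's actual implementation: the random-walk results Qi are faked with random matrices, and names such as n_patients and annotation are assumptions.

```julia
# Illustrative sketch of the mashup integration steps on synthetic data.
# The Q_i matrices below stand in for real random-walk-with-restarts results.
using LinearAlgebra

n_patients, n_networks = 50, 3

# Step 1 (faked): one Q_i per similarity network A_i.
Qs = [rand(n_patients, n_patients) for _ in 1:n_networks]

# Step 2: smoothing, R_i = log(abs(Q_i) + 1/n_patients).
Rs = [log.(abs.(Q) .+ 1 / n_patients) for Q in Qs]

# Step 3: concatenate along the row axis into one big matrix N.
N = vcat(Rs...)

# Step 4: SVD.
U, S, V = svd(N)

# Step 5: H = U * S^(1/2) and V = S^(1/2) * V'.
H = U * Diagonal(sqrt.(S))
Vs = Diagonal(sqrt.(S)) * V'

# Step 6: left division against a binary annotation vector
# (1 = query type, -1 = not-query type; assignment here is arbitrary).
annotation = [i <= 25 ? 1.0 : -1.0 for i in 1:n_patients]
beta = Vs' \ annotation
```

Here H stacks the per-network feature rows (n_networks * n_patients rows), and beta has one coefficient per latent dimension.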
Getting network weights through cross-validation
for cl in classes                        # each class
    net_weights = zeros(n_networks, 10)  # N networks x 10 CV folds
    for k in 1:10                        # 10-fold cross-validation
        qry_k = ...                      # 90% training samples of class cl
        for j in 1:n_networks
            # rows of H corresponding to network j and,
            # within those, the samples in qry_k
            H_cur = H[(j, qry_k), :]
            all_weights = cor(H_cur, beta)
            # or get the network weight by dot product:
            # all_weights = H_cur * beta
            net_weights[j, k] = mean(all_weights)
        end
    end
end
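The pseudocode above can be made runnable on synthetic data as follows. Everything here is fabricated for illustration (H, beta, the class labels, and the fold split), and the dot-product variant of the weight is used; the indexing scheme for selecting network j's rows of H is an assumption about how H is stacked.

```julia
# Runnable sketch of the cross-validation weight loop on synthetic data.
using Statistics, Random

Random.seed!(1)
n_patients, n_networks, n_folds = 40, 3, 10
H = rand(n_networks * n_patients, n_patients)  # stacked per-network rows
beta = rand(n_patients)
labels = rand(1:2, n_patients)                 # two hypothetical classes

net_weights = Dict{Int,Matrix{Float64}}()
for cl in 1:2
    members = findall(==(cl), labels)
    w = zeros(n_networks, n_folds)
    for k in 1:n_folds
        # qry_k: ~90% "training" samples of class cl (drop every n_folds-th)
        qry_k = [m for (i, m) in enumerate(members) if i % n_folds != k - 1]
        for j in 1:n_networks
            # rows of H for network j, restricted to the samples in qry_k
            rows = (j - 1) * n_patients .+ qry_k
            H_cur = H[rows, :]
            all_weights = H_cur * beta         # dot-product variant
            w[j, k] = mean(all_weights)
        end
    end
    net_weights[cl] = w
end
```

Each class ends up with an n_networks-by-n_folds matrix of weights; averaging across folds would give one weight per network per class.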
Reference:
Cho, Berger, and Peng, Compact Integration of Multi-Network Topology for Functional Analysis of Genes, Cell Systems, 2016, p. 13.
Label propagation
For those who want to continue developing the label propagation algorithm, the only function you need to modify is label_propagation!.
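For orientation, a generic label-propagation iteration (in the style of Zhou et al.'s "learning with local and global consistency" update, F ← αSF + (1−α)Y) is sketched below. This is not necessarily the exact update that label_propagation! implements; the function name propagate and its parameters are hypothetical.

```julia
# Generic label-propagation sketch; NOT the package's actual update rule.
using LinearAlgebra

function propagate(W::Matrix{Float64}, Y::Vector{Float64};
                   alpha = 0.9, iters = 100)
    d = vec(sum(W, dims = 2))                  # node degrees
    Dinv = Diagonal(1 ./ sqrt.(d))
    S = Dinv * W * Dinv                        # symmetrically normalized W
    F = copy(Y)
    for _ in 1:iters
        F = alpha .* (S * F) .+ (1 - alpha) .* Y
    end
    return F
end

# Tiny example: a 4-node chain with opposite labels at the two ends.
W = [0 1 0 0; 1 0 1 0; 0 1 0 1; 0 0 1 0.0]
Y = [1.0, 0.0, 0.0, -1.0]
F = propagate(W, Y)
```

By the symmetry of this chain, the propagated scores are antisymmetric: F[1] = -F[4] and F[2] = -F[3], with the unlabeled middle nodes pulled toward their nearer labeled end.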