
magiclamp's People

Contributors

arkadiy-garber


magiclamp's Issues

Conceptual question about '--norm' flag

Hi again,

This is not a bug report, but a conceptual question about using (or not using) the --norm flag with the different genies in MagicLamp. I realize the answer may depend on the research question and dataset, but I thought I would ask out of curiosity: is it more appropriate to report functional results as raw gene counts, or as normalized abundance? I understand that MagicLamp's normalization accounts for the number of ORFs in each metagenome, and I am running my data both ways to compare. However, I've seen results reported both ways in the literature (raw counts and normalized), and there does not seem to be a consensus method. I would appreciate any insight!
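For reference, my understanding of what the normalization does is roughly the following (a rough sketch in Python, not MagicLamp's actual code; the sample names and numbers are made up):

    # hits for one gene family in two metagenomes (illustrative numbers)
    raw_counts = {"metagenomeA": 12, "metagenomeB": 12}
    # total number of ORFs called in each metagenome
    total_orfs = {"metagenomeA": 40000, "metagenomeB": 120000}

    for sample, hits in raw_counts.items():
        norm = hits / total_orfs[sample] * 100  # hits per 100 ORFs
        print(sample, hits, round(norm, 4))

    # identical raw counts, but metagenomeB's normalized abundance is 3x lower
    # because it has 3x more ORFs -- which is why --norm matters when
    # metagenomes differ in size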

Thank you,
Rachel

Sources for HMMs for gene hits in LithoGenie

Hi,
I am trying to track down the source of HMMs for each gene hit within my dataset after running LithoGenie, and I was able to find most of them in the supplementary info for this paper: https://www.nature.com/articles/ncomms13219

However, I couldn't find sources for a handful of HMMs, listed here:
lithogenie_hits_unknown_hmm_source.xlsx

I've also searched the "hmms" folder in this repository, and while I found the HMM files themselves, they do not list accession numbers. Where can I find this information?
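For reference, my reading of the HMMER3 profile format (not anything MagicLamp-specific) is that the accession lives on an optional ACC line in the file header, which is only present if one was assigned when the profile was built, so custom-built HMMs often lack it entirely. For example (made-up values):

    HMMER3/f [3.1b2 | February 2015]
    NAME  cyc2
    ACC   PF00000.0    <- optional; absent in many custom HMMs
    LENG  450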

Thank you!

Different results of LithoGenie and FeGenie

I found that LithoGenie predicts more cyc2 genes than FeGenie. Why is that, and which result should I trust? Moreover, I found the FeGenie results to be unstable: a cyc2_cluster3 hit is predicted in one genome when I run it together with 10 other genomes, but when I run that same genome alone, no cyc2 is reported.

Issue with installation

Hi! I was trying to run MagicLamp; however, it shows an error suggesting the conda environment is not there. Please help.

Is it possible to get some

WspGenie error writing heatmap

Hello,
I am running WspGenie, and everything went well except for the very last step of writing the heatmap, which gave this error:

Traceback (most recent call last):
  File "/Users/firefly-hl/MagicLamp/MagicLamp.py", line 36, in <module>
    WspGenie.main()
  File "/Users/firefly-hl/MagicLamp/genies/WspGenie.py", line 1064, in main
    outHeat.write(str((len(Dict[j][i]) / int(normDict[j])) * float(100)) + ",")
TypeError: int() argument must be a string, a bytes-like object or a number, not 'collections.defaultdict'
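From the message, it looks like normDict[j] is itself a defaultdict rather than a number. A minimal sketch that reproduces the same TypeError (my guess at the shape of the data, not MagicLamp's actual structures):

    from collections import defaultdict

    # if normDict is built as a nested defaultdict, then normDict[j]
    # is itself a dict rather than an ORF count
    normDict = defaultdict(lambda: defaultdict(list))
    normDict["genome1"]["gene1"].append("orf_1")

    int(normDict["genome1"])  # raises the same TypeError as above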

Any advice on how to resolve this?

Thank you!

MagnetoGenie highlights R package not installed in conda environment

Please see the error below. It looks like ggpubr isn't included in the R packages for the conda environment.
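If it's useful to others hitting this, a possible workaround (an assumption on my part: r-ggpubr is the conda-forge package name for ggpubr) would be to install it into the environment manually:

    conda install -n magiclamp -c conda-forge r-ggpubr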

(magiclamp) sean-mini-server:2022_PCC_ManuscriptWork sean_server$ MagicLamp.py MagnetoGenie -bin_dir bins -bin_ext faa -out heme_MagnetoGenie -t 4 --orfs --makeplots
checking arguments
.
.
.
All required arguments provided!

starting main pipeline...
analyzing All_proteins_fullannotation_HEMEGENES.faa: 100%   
Identifying genomic proximities and putative operons
Clustering ORFs...

..
...
....
.....
......
.......
Finished!
Running Rscript to generate plots. Do not be alarmed if you see Warning or Error messages from Rscript. This will not affect any of the output data that was already created. If you see plots generated, great! If not, you can plot the data as you wish on your own, or start an issue on MagnetoGenie's GitHub repository

Error in library("ggpubr", lib.loc = library.path) : 
  there is no package called ‘ggpubr’
Execution halted
Error in hclust(d = dist(x = fegenie.scaled)) : 
  must have n >= 2 objects to cluster
Calls: as.dendrogram -> hclust
Execution halted

...
Running Rscript to generate plots. Do not be alarmed if you see Warning or Error messages from Rscript. This will not affect any of the output data that was already created. If you see plots generated, great! If not, you can plot the data as you wish on your own, or start an issue on MagnetoGenie's GitHub repository

Error in library("ggpubr", lib.loc = library.path) : 
  there is no package called ‘ggpubr’
Execution halted
Error in hclust(d = dist(x = fegenie.scaled)) : 
  must have n >= 2 objects to cluster
Calls: as.dendrogram -> hclust
Execution halted

Results are written to heme_MagnetoGenie/magnetogenie-summary.csv and heme_MagnetoGenie/magnetogenie.heatmap.csv
Pipeline finished without crashing!!! Thanks for using :)

Error when running LithoGenie

Error from a run using the conda installation. Input was a .faa file with ORFs from Anvi'o.

(magiclamp) sean-mini-server:2022_PCC_ManuscriptWork sean_server$ MagicLamp.py LithoGenie -bin_dir bins -bin_ext faa -out heme_LithoGenie -t 4 --orfs --makeplots --all_results
checking arguments
.
.
.
All required arguments provided!

reading in HMM bitscore cut-offs...
...
starting main pipeline...
analyzing All_proteins_fullannotation_HEMEGENES.faa: 99%
Identifying genomic proximities and putative operons
Traceback (most recent call last):
  File "/Users/sean_server/software/MagicLamp/MagicLamp.py", line 30, in <module>
    LithoGenie.main()
  File "/Users/sean_server/software/MagicLamp/genies/LithoGenie.py", line 954, in main
    CoordDict[i][contig].append(int(numOrf))
ValueError: invalid literal for int() with base 10: 'JV1'
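My guess (not verified against the LithoGenie source) is that the code takes the ORF number from the last underscore-delimited field of each Prodigal-style ORF ID, and the Anvi'o headers don't follow that pattern, so int() receives a non-numeric token. A sketch of the parsing I mean, with made-up headers:

    # Prodigal-style ORF ID: contig name plus the ORF's number after the last "_"
    orf = "NODE_12_345"          # made-up header
    contig, num_orf = orf.rsplit("_", 1)
    print(contig, int(num_orf))  # NODE_12 345

    # a header whose last field is not numeric (e.g. after Anvi'o renaming)
    orf = "contig_001_JV1"       # made-up header
    contig, num_orf = orf.rsplit("_", 1)
    int(num_orf)                 # ValueError: invalid literal for int() with base 10: 'JV1'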

BVCN-examples issue: Tutorial 2 (HMM development and calibration), HmmGenie

All required arguments provided!

Finding ORFs for Acidithiobacillus_ferrooxidans.txt
Finding ORFs for Mariprofundus_ferrooxidans_PV-1.txt
Finding ORFs for Rhodopseudomonas_palustris_TIE-1.txt
Finding ORFs for Shewanella_oneidensis_MR-1.txt
starting main pipeline...
analyzing Acidithiobacillus_ferrooxidans.txt: 25%
Traceback (most recent call last):
  File "/MagicLamp/MagicLamp.py", line 46, in <module>
    HmmGenie.main()
  File "/MagicLamp/genies/HmmGenie.py", line 788, in main
    % (metaDict[hmm]["evalue"], int(args.t), outDirectory, i, hmm, outDirectory, i, hmm, args.hmm_dir, hmm, outDirectory, i))
UnboundLocalError: local variable 'metaDict' referenced before assignment
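For what it's worth, an UnboundLocalError like this usually means metaDict is only assigned inside a branch that didn't execute for this input. A minimal illustration of the pattern (illustrative only, not HmmGenie's actual code):

    def main(meta_file=None):
        if meta_file is not None:
            metaDict = {"someHMM": {"evalue": 1e-10}}  # only assigned on this branch
        # with meta_file=None the assignment never runs, so the next line raises
        # UnboundLocalError: local variable 'metaDict' referenced before assignment
        print(metaDict["someHMM"]["evalue"])

    main()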

Error finding *.depth file

Hello,

I am trying to use LithoGenie to functionally annotate my MAGs.
The programme works fine until I use the -bams option to normalise by coverage from read mapping (my MAGs were generated from a co-assembly, and I made a bam.txt following your instructions for co-assembled genomes).

I keep getting this error:


processing... maxbin.031_sub.contigs.fa
Output depth matrix to maxbin.031_sub.contigs.fa.depth
jgi_summarize_bam_contig_depths 2.15 (Bioconda) 2020-07-03T13:02:15
Output matrix to maxbin.031_sub.contigs.fa.depth
0: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT31.bam
1: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT32.bam
2: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT33.bam
3: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT34.bam
4: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT35.bam
5: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT41.bam
6: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT42.bam
7: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT43.bam
8: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT44.bam
9: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT45.bam
10: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT51.bam
11: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT52.bam
12: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT53.bam
13: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT54.bam
14: Opening bam: /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/bin_coverage/BT55.bam
Processing bam files
Thread 0 finished: BT31.bam with 41784108 reads and 2548900 readsWellMapped
Thread 13 finished: BT54.bam with 37818784 reads and 1818920 readsWellMapped
Thread 8 finished: BT44.bam with 47649368 reads and 8359982 readsWellMapped
Thread 12 finished: BT53.bam with 42732468 reads and 3793898 readsWellMapped
Thread 6 finished: BT42.bam with 41398298 reads and 4262408 readsWellMapped
Thread 14 finished: BT55.bam with 14976816 reads and 801879 readsWellMapped
Thread 7 finished: BT43.bam with 76231146 reads and 12109136 readsWellMapped
Thread 10 finished: BT51.bam with 20372246 reads and 805082 readsWellMapped
Thread 5 finished: BT41.bam with 52802668 reads and 3544574 readsWellMapped
Thread 11 finished: BT52.bam with 28965720 reads and 1986958 readsWellMapped
Thread 4 finished: BT35.bam with 34227688 reads and 5284176 readsWellMapped
Thread 2 finished: BT33.bam with 21982532 reads and 1772437 readsWellMapped
Thread 3 finished: BT34.bam with 33913086 reads and 3600508 readsWellMapped
Thread 1 finished: BT32.bam with 61539122 reads and 4501818 readsWellMapped
Thread 9 finished: BT45.bam with 77754398 reads and 11829835 readsWellMapped
Creating depth matrix file: maxbin.031_sub.contigs.fa.depth
Closing most bam files
Closing last bam file
Finished
processing... maxbin.031_sub.contigs.fa
Traceback (most recent call last):
  File "/scale_wlg_nobackup/filesets/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/1.Prodig_anno_renamed/MagicLamp/genies/LithoGenie.py", line 2253, in main
    depth = open("%s/contigDepths/%s.depth" % (args.out, cell))
FileNotFoundError: [Errno 2] No such file or directory: 'lithogenie_output_bam/contigDepths/maxbin.031_sub.contigs.fa.depth'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/1.Prodig_anno_renamed/MagicLamp/MagicLamp.py", line 30, in <module>
    LithoGenie.main()
  File "/scale_wlg_nobackup/filesets/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/1.Prodig_anno_renamed/MagicLamp/genies/LithoGenie.py", line 2265, in main
    depth = open("%s/contigDepths/%s.depth" % (args.out, cell))
FileNotFoundError: [Errno 2] No such file or directory: 'lithogenie_output_bam/contigDepths/maxbin.031_sub.contigs.fa.depth'


I have noticed that the file "maxbin.031_sub.contigs.fa.depth" was generated; however, it is not inside the output directory.
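As a temporary workaround, I am considering moving the generated .depth files into the folder the traceback says LithoGenie expects before re-running (a sketch, assuming the .depth files land in the working directory):

    import glob
    import os
    import shutil

    dest = "lithogenie_output_bam/contigDepths"
    os.makedirs(dest, exist_ok=True)

    # move every *.depth file from the working directory into the folder
    # LithoGenie expects, then re-run with the same -out directory
    for depth_file in glob.glob("*.depth"):
        shutil.move(depth_file, os.path.join(dest, depth_file))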

This is the script I used:

MagicLamp.py LithoGenie -bin_dir /nesi/nobackup/nesi00465/raw_data/Chapter_2_version_Sep2/Chapter_2/MAGs/4.Qual_bins/1.Prodig_anno_renamed/MagicLamp/lithogenie_output/ORF_calls -bin_ext faa -out lithogenie_output_bam -bams bams3.txt --orfs

Cheers,

Maria

Potential setup issue

Hey @Arkadiy-Garber

I was trying to set up MagicLamp again and noticed the following potential issue in the setup.sh script, which looks like this:

conda create -n fegenie -c r r-ggplot2 r-stringi r-reshape r-reshape2 r-tidyverse r-argparse r-ggdendro r-pvclust python=3.7 hmmer diamond prodigal blast metabat2 --yes

## activating environment
source activate magiccave

Line #11 creates a conda environment called "fegenie". However, lines #18 and #49 activate an environment called "magiccave".
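For anyone hitting this before it's patched, the likely fix (my assumption: the intended environment name is magiclamp, matching the shell prompt shown in other issues here) is to make the two names agree:

    conda create -n magiclamp -c r r-ggplot2 r-stringi r-reshape r-reshape2 r-tidyverse r-argparse r-ggdendro r-pvclust python=3.7 hmmer diamond prodigal blast metabat2 --yes

    source activate magiclamp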

ERROR

# To activate this environment, use
#
#     $ conda activate fegenie
#
# To deactivate an active environment, use
#
#     $ conda deactivate

setup.sh: line 13: Rscript: command not found
setup.sh: line 14: Rscript: command not found
setup.sh: line 15: Rscript: command not found
setup.sh: line 18: activate: No such file or directory
mkdir: cannot create directory ‘/etc/conda’: Permission denied
setup.sh: line 46: /etc/conda/activate.d/env_vars.sh: No such file or directory
setup.sh: line 49: activate: No such file or directory

        DONE!

Thanks again for a great pipeline!

missing import time in FeGenie

Hi,
I got this error when running the FeGenie pipeline

Traceback (most recent call last):
  File "/home/gmichoud/MagicLamp/MagicLamp.py", line 28, in <module>
    FeGenie.main()
  File "/home/gmichoud/MagicLamp/genies/FeGenie.py", line 843, in main
    time.sleep(5)
NameError: name 'time' is not defined

Adding "import time" to the FeGenie.py script solved the issue.
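That is, at the top of genies/FeGenie.py:

    import time  # provides time.sleep(), used at line 843 in main()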
Best
Greg
