When running

dev <- computeDeviations(object = example_counts, annotations = motif_ix)

what does this error mean?

Error in counts_check(object) : ncol(object) > 1
The object is created using

fragment_counts <- getCounts(bamfile, peaks2, paired = TRUE, by_rg = FALSE, format = "bam", colData = DataFrame(celltype = "LNCAP_CR08"))
The bamfile is a single .bam file, and the peaks come from a single narrowPeak file read in and sorted using
peaks2 <- getPeaks(peakfile, sort_peaks = TRUE)
> head(example_counts)
class: RangedSummarizedExperiment
dim: 6 1
metadata(0):
assays(1): counts
rownames: NULL
rowData names(1): bias
colnames(1): LNCAP_CR08.bam
colData names(2): celltype depth
I have a single .bam and single peak file. Does this mean I must have replicates? Or does it mean I need a control or comparison peakset?
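For context, the error message points at the failed check itself: chromVAR's internal counts_check requires the counts object to have more than one column, i.e. more than one sample or cell. A minimal sketch, assuming hypothetical replicate BAM file names, of building a multi-column object that would pass this check:

```r
# Sketch only: the replicate file names below are hypothetical placeholders.
library(chromVAR)
library(SummarizedExperiment)

# getCounts accepts a vector of alignment files; each file becomes one
# column of the resulting RangedSummarizedExperiment.
bamfiles <- c("LNCAP_CR08_rep1.bam", "LNCAP_CR08_rep2.bam")

peaks2 <- getPeaks(peakfile, sort_peaks = TRUE)

fragment_counts <- getCounts(bamfiles, peaks2,
                             paired = TRUE, by_rg = FALSE, format = "bam",
                             colData = DataFrame(celltype = c("LNCAP_CR08",
                                                              "LNCAP_CR08")))

# With ncol(fragment_counts) > 1, counts_check should no longer stop here.
dev <- computeDeviations(object = fragment_counts, annotations = motif_ix)
```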
I can't get past this step.
Thanks!
I had the same problem
Me too ...
Any updates?