BulkRNABert

Tags: Feature Extraction · Transformers · Joblib · Safetensors · bulk RNA-seq · biology · transcriptomics · custom_code
Instructions for using InstaDeepAI/BulkRNABert with libraries, inference providers, notebooks, and local apps. Follow these links to get started.

- Libraries
  - Transformers

How to use InstaDeepAI/BulkRNABert with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="InstaDeepAI/BulkRNABert", trust_remote_code=True)
```

```python
# Load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("InstaDeepAI/BulkRNABert", trust_remote_code=True, dtype="auto")
```

- Notebooks
  - Google Colab
  - Kaggle
Upload BulkRNABert
bulkrnabert.py (+0 −2)
bulkrnabert.py — CHANGED

```diff
@@ -310,8 +310,6 @@ class BulkRNABert(PreTrainedModel):
         gene_embedding = self.fc_gene_embedding(gene_embedding)
         x = x + gene_embedding
 
-        outs["embeddings"] = x
-
         if attention_mask is None:
             batch_size, seq_length = input_ids.shape
             attention_mask = torch.ones(  # noqa
```
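The context lines in the hunk above build a default all-ones attention mask whenever none is supplied, sized from the `(batch, seq)` shape of `input_ids`. A dependency-free sketch of that fallback logic (the helper name is hypothetical, and plain nested lists stand in for torch tensors):

```python
def default_attention_mask(input_ids):
    """Return an all-ones mask with the same (batch, seq) shape as input_ids.

    Mirrors the torch.ones(...) fallback in the hunk above, using nested
    lists instead of tensors so this sketch has no torch dependency.
    """
    batch_size = len(input_ids)
    seq_length = len(input_ids[0])
    return [[1] * seq_length for _ in range(batch_size)]

# A batch of 2 sequences of length 3 yields a 2x3 mask of ones.
mask = default_attention_mask([[4, 7, 2], [9, 1, 5]])
# mask == [[1, 1, 1], [1, 1, 1]]
```

In the model itself this mask is a `torch.Tensor` on the same device as `input_ids`; the sketch only illustrates the shape logic.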