PMLM-SFT

This repository accompanies our paper "Improving Social Meaning Detection with Pragmatic Masking and Surrogate Fine-Tuning", accepted at WASSA@ACL-2022.

Trained Models

You can load these models from the Hugging Face Hub and use them for downstream fine-tuning. For example:

from transformers import AutoTokenizer, AutoModelForSequenceClassification

label_size = 2  # set to the number of classes in your downstream task

tokenizer = AutoTokenizer.from_pretrained('UBC-NLP/prags1', use_fast=True)
model = AutoModelForSequenceClassification.from_pretrained('UBC-NLP/prags1', num_labels=label_size)
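
Below is a minimal end-to-end sketch of that downstream step using the Hugging Face Trainer API. The dataset (tweet_eval/irony), column names, output directory, and hyperparameters are illustrative assumptions, not settings from the paper.

# Minimal fine-tuning sketch with the Hugging Face Trainer API.
# NOTE: dataset choice, hyperparameters, and output path are assumptions for illustration.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset('tweet_eval', 'irony')  # any single-sentence classification task
label_size = dataset['train'].features['label'].num_classes

tokenizer = AutoTokenizer.from_pretrained('UBC-NLP/prags1', use_fast=True)
model = AutoModelForSequenceClassification.from_pretrained('UBC-NLP/prags1', num_labels=label_size)

def tokenize(batch):
    # Tokenize the raw tweets; truncation/padding keeps all sequences at a fixed length.
    return tokenizer(batch['text'], truncation=True, padding='max_length', max_length=128)

dataset = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir='prags1-finetuned',
    num_train_epochs=3,
    per_device_train_batch_size=32,
    learning_rate=2e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset['train'],
    eval_dataset=dataset['validation'],
)
trainer.train()

# Save the fine-tuned model and tokenizer for later inference.
trainer.save_model('prags1-finetuned')
tokenizer.save_pretrained('prags1-finetuned')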

We evaluate our models on 15 social meaning tasks.
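
Once fine-tuned on one of these tasks, the checkpoint can be used for prediction. A minimal sketch, assuming the hypothetical 'prags1-finetuned' directory saved by the training example above:

from transformers import pipeline

# Load the (hypothetical) fine-tuned checkpoint saved by the sketch above.
classifier = pipeline('text-classification', model='prags1-finetuned')
print(classifier('Oh great, another Monday.'))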

Please cite our paper:

@inproceedings{zhang-abdul-mageed-2022-improving,
    title = "Improving Social Meaning Detection with Pragmatic Masking and Surrogate Fine-Tuning",
    author = "Zhang, Chiyu  and
      Abdul-Mageed, Muhammad",
    booktitle = "Proceedings of the 12th Workshop on Computational Approaches to Subjectivity, Sentiment {\&} Social Media Analysis",
    month = may,
    year = "2022",
    address = "Dublin, Ireland",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.wassa-1.14",
    pages = "141--156",
}
