jmy's repositories
ProLLM
We propose Protein Chain of Thought (ProCoT), which replicates the biological mechanism of signaling pathways as natural language prompts. ProCoT considers a signaling pathway as a protein reasoning process, which starts from upstream proteins and passes through several intermediate proteins to transmit biological signals to downstream proteins.
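As a rough illustration of the idea, a signaling pathway can be serialized into a step-by-step prompt that walks from the upstream protein through intermediates toward a downstream target. The function name, pathway, and wording below are hypothetical sketches, not the repository's actual prompt format.

```python
# Hypothetical sketch of a ProCoT-style prompt: a signaling pathway is
# serialized as a reasoning chain from an upstream protein through
# intermediate proteins to a downstream question.

def build_procot_prompt(pathway):
    """Turn an ordered protein pathway into a chain-of-thought prompt."""
    upstream, *intermediates, downstream = pathway
    steps = [f"Step 1: the signal originates at the upstream protein {upstream}."]
    prev = upstream
    for i, protein in enumerate(intermediates, start=2):
        steps.append(f"Step {i}: {prev} transmits the signal to {protein}.")
        prev = protein
    steps.append(
        f"Step {len(intermediates) + 2}: does {prev} transmit the signal "
        f"to the downstream protein {downstream}? Answer yes or no."
    )
    return "\n".join(steps)

# Example with the (well-known) EGFR -> RAS -> RAF -> MEK -> ERK cascade.
prompt = build_procot_prompt(["EGFR", "RAS", "RAF", "MEK", "ERK"])
print(prompt)
```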
The-Impact-of-Reasoning-Step-Length-on-Large-Language-Models
[ACL'24] Chain of Thought (CoT) is significant in improving the reasoning abilities of large language models (LLMs). However, the correlation between the effectiveness of CoT and the length of reasoning steps in prompts remains largely unknown. To shed light on this, we conducted several empirical experiments to explore this relationship.
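One way to vary step length while holding content fixed, sketched below, is to pad a base chain-of-thought demonstration with content-neutral restatement steps. This is an illustrative setup, not necessarily the paper's exact protocol; the function name and padding scheme are assumptions.

```python
# Illustrative sketch: expand a base CoT demonstration to a target number
# of reasoning steps by inserting neutral restatements, so the effect of
# step length can be measured with the question and answer held fixed.

def expand_cot(question, base_steps, answer, target_len):
    """Pad a CoT demonstration with restatements until it has target_len steps."""
    steps = list(base_steps)
    while len(steps) < target_len:
        # Insert a content-neutral restatement after the first step.
        steps.insert(1, f"Restating: {steps[0]}")
    lines = [f"Q: {question}"]
    lines += [f"Step {i}: {s}" for i, s in enumerate(steps, start=1)]
    lines.append(f"A: {answer}")
    return "\n".join(lines)

demo = expand_cot(
    "Roger has 5 balls and buys 2 cans of 3 balls each. How many balls now?",
    ["5 + 2 * 3 = 11 balls in total."],
    "11",
    target_len=3,
)
print(demo)
```

Sweeping `target_len` over a range of values then gives prompts that differ only in reasoning-step count.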
VAEGAN_for_GZSL_based_on_Mahalanobis_distance
VAEGAN, I Love u
We_media_generation
Intelligent portrait cropping based on heat maps; keyword extraction based on part-of-speech tagging; prompt engineering for large models based on BERT fine-tuning.
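For the heat-map cropping component, a minimal sketch (assumed, not the repository's actual pipeline) is to slide a fixed-size window over a saliency heat map and crop the image region whose window accumulates the most heat:

```python
# Minimal sketch of heat-map-driven cropping: exhaustively score every
# fixed-size window on the heat map and return the hottest image crop.
import numpy as np

def crop_by_heatmap(image, heatmap, crop_h, crop_w):
    """Return the crop of `image` whose heat-map window has maximal total heat."""
    H, W = heatmap.shape
    best, best_rc = -np.inf, (0, 0)
    for r in range(H - crop_h + 1):
        for c in range(W - crop_w + 1):
            score = heatmap[r:r + crop_h, c:c + crop_w].sum()
            if score > best:
                best, best_rc = score, (r, c)
    r, c = best_rc
    return image[r:r + crop_h, c:c + crop_w]

img = np.arange(36).reshape(6, 6)
heat = np.zeros((6, 6))
heat[3:5, 2:4] = 1.0          # hottest region of the saliency map
crop = crop_by_heatmap(img, heat, 2, 2)
print(crop)
```

A real system would use a learned saliency model to produce the heat map and an integral image (summed-area table) to score windows in constant time rather than this quadratic scan.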
Multimode-fusion-method-under-multi-task
A novel multimodal fusion approach for multi-task learning, termed Attentive Tensor Alignment.