๐Ÿ”Research Agent

Source Annotator

Highlight, annotate, and cross-reference sources without leaving the app

Highlight
key passages
Cross-ref
AI linking
87K+
citations tracked

See it in action

Attention Is All You Need – Vaswani et al., 2017 · arxiv.org

We propose a new simple network architecture, the Transformer, based solely on attention mechanisms, dispensing with recurrence and convolutions entirely. Experiments on two machine translation tasks show these models to be superior in quality...

...the model achieves 28.4 BLEU on the WMT 2014 English-to-German translation task, improving over the existing best results, including ensembles by over 2 BLEU...

📌

Key benchmark claim – cross-reference with Source 3 (d'Ascoli 2021), which disputes generalization on low-resource languages

2017 · NeurIPS · 📈 Cited 87,432 times

Read papers directly in GoalOS and highlight key passages with one tap. Add margin notes, flag claims that need cross-referencing, and copy formatted quotes. The AI automatically links your annotation to related notes in your knowledge base and surfaces conflicting evidence from other sources you've read, so you build a nuanced, well-sourced understanding.

Ready to finish what you start?

Join thousands of people who are finally achieving their goals with AI-powered planning and tracking.

Get Started Free

Free to start. No credit card required.