@vortex_egg There are two techniques in particular I'd like to suggest which ... well, they don't fully work but they seem to help:
1) Time block your information-gathering phase. Whether that's on a daily or weekly ongoing basis, or as a project phase, say "I'll scan Twitter for X minutes per day, only". And do that at the end of the day, when you've taken care of high-relevance/payoff tasks first.
2) What I call #BOTI: "Best of the Interval". On a daily, weekly, monthly, quarterly, annual ... basis, review the items you've flagged as noteworthy from that period, along with the top items rolled up from the next-smaller interval, and select the n best. You'll probably find that a good value is 10 <= n <= 100, but do what works for you.
BOTI draws on the 43 folders / tickler file concept, or the round-robin database. Essentially you're determining that no matter how long your research goes on, you're committing to a finite set of retained data.
(This is used in all kinds of IT systems and network monitoring, especially with long-term data history.)
You end up with higher resolution in recent / near periods, lower resolution as you go back in time. But you're constantly trying to filter up the best stuff. Since assessment can take time, you'll re-scan earlier selections to see if you'd missed something of relevance (and you can always break protocol for something especially good). But you've got a structure and have set limits on scope.
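The rollup above can be sketched in a few lines of Python. This is purely illustrative (the `Item`/`boti` names and scoring are my invention, not a prescribed tool): each interval keeps only its n best items, and those feed the pool for the next-larger interval.

```python
# Hypothetical sketch of a BOTI ("Best of the Interval") rollup.
# Names, structure, and scores are illustrative only.
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    score: float  # your subjective usefulness rating

def boti(candidates, n):
    """Keep only the n best items selected during an interval."""
    return sorted(candidates, key=lambda it: it.score, reverse=True)[:n]

# Daily picks roll up into the week; weekly bests would roll up
# into the month, and so on.
days = [
    [Item("thread on tickler files", 0.9), Item("random hot take", 0.2)],
    [Item("RRD internals post", 0.8), Item("meme", 0.1)],
]
weekly_pool = [item for day in days for item in boti(day, 3)]
week_best = boti(weekly_pool, 2)  # retained set stays finite, however long you run
```

The point of the sketch is the invariant: no matter how many days you feed in, each level holds at most n items, which is the round-robin-database property described above.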
You'll also start to develop a sense, over time, of what actually provides value, and if you track sources, which of those are most valuable. Filter noise aggressively.
A source that sometimes generates signal but usually doesn't ... is virtually always noise. Signal tends to come through, eventually.
(This is related to my "block fuckwits" advice.)
#Zettelkasten #DeepWork #GettingThingsDone #DavidAllen #CalNewport #InformationOverload #LiteratureSearch #ResearchMethods #Research
2/