The following example shows the key steps required to retrieve a user's tweets using the Twitter Hosebird Client (hbc) streaming library. Pay attention to how the endpoint, the authentication object, and the client are wired together in the code below:
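Before compiling the example, the hbc library needs to be on the classpath. A minimal sketch of the Maven dependency is shown below; the artifact coordinates `com.twitter:hbc-core` are the published ones, but the version shown is only illustrative, so check Maven Central for the current release:

```xml
<!-- Hosebird Client core (the version number here is illustrative) -->
<dependency>
    <groupId>com.twitter</groupId>
    <artifactId>hbc-core</artifactId>
    <version>2.2.0</version>
</dependency>
```

hbc-core pulls in Guava transitively, which provides the `Lists.newArrayList` helper used in the example.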
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

import com.google.common.collect.Lists;
import com.twitter.hbc.ClientBuilder;
import com.twitter.hbc.core.Client;
import com.twitter.hbc.core.Constants;
import com.twitter.hbc.core.endpoint.StatusesFilterEndpoint;
import com.twitter.hbc.core.processor.StringDelimitedProcessor;
import com.twitter.hbc.httpclient.auth.Authentication;
import com.twitter.hbc.httpclient.auth.OAuth1;

public class FilterStreamExample {

    public static final String CONSUMER_KEY = "Your_consumer_key";
    public static final String CONSUMER_SECRET = "Your_consumer_secret";
    public static final String ACCESS_TOKEN = "Your_access_token";
    public static final String ACCESS_TOKEN_SECRET = "Your_access_token_secret";

    public static void process(String consumerKey, String consumerSecret,
            String token, String secret) throws InterruptedException {
        //
        // Create the queue on which the client will place incoming messages
        //
        BlockingQueue<String> queue = new LinkedBlockingQueue<String>(10000);
        //
        // Create an endpoint of type StatusesFilterEndpoint; it has APIs to
        // retrieve users' tweets, or tweets related to mentions or hashtags
        //
        StatusesFilterEndpoint endpoint = new StatusesFilterEndpoint();
        //
        // Add one or more user IDs whose tweets should be followed
        //
        endpoint.followings(Lists.newArrayList(136976940L));
        //
        // Create an OAuth object using the consumer key/secret and access token/secret
        //
        Authentication auth = new OAuth1(consumerKey, consumerSecret, token, secret);
        //
        // Create a new BasicClient. By default gzip is enabled.
        //
        Client client = new ClientBuilder().hosts(Constants.STREAM_HOST)
                .endpoint(endpoint).authentication(auth)
                .processor(new StringDelimitedProcessor(queue)).build();
        //
        // Establish a connection
        //
        client.connect();
        //
        // The loop below takes messages off the queue as they arrive from Twitter.
        // Do whatever needs to be done with each message; here it is simply
        // printed, but in the real world it could be stored in Hadoop storage.
        //
        for (int msgRead = 0; msgRead < 1000; msgRead++) {
            String msg = queue.take();
            System.out.println(msg);
        }
        client.stop();
    }

    public static void main(String[] args) {
        try {
            FilterStreamExample.process(CONSUMER_KEY, CONSUMER_SECRET,
                    ACCESS_TOKEN, ACCESS_TOKEN_SECRET);
        } catch (InterruptedException e) {
            System.out.println(e);
        }
    }
}
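Each message taken from the queue is a raw JSON status string. As a minimal illustration (assuming the standard v1.1 status payload, in which the tweet body sits in a top-level `text` field), the hypothetical helper below pulls that field out with plain string operations; a real application would use a proper JSON library such as Jackson instead.

```java
public class TweetTextExtractor {

    // Extract the value of the top-level "text" field from a raw status JSON string.
    // Minimal sketch: assumes the field is present and its value contains no
    // escaped quote characters.
    public static String extractText(String json) {
        String key = "\"text\":\"";
        int start = json.indexOf(key);
        if (start < 0) {
            return null;
        }
        start += key.length();
        int end = json.indexOf('"', start);
        return end < 0 ? null : json.substring(start, end);
    }

    public static void main(String[] args) {
        // A shortened sample of what a status message from the stream looks like
        String msg = "{\"created_at\":\"...\",\"text\":\"Hello from the stream\",\"user\":{}}";
        System.out.println(extractText(msg));
    }
}
```

In the example above, this helper would be called on each `msg` inside the read loop before printing or storing it.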