Conventional wisdom is that Congress doesn’t support the trade war. Sentiment analysis of every member’s Twitter feeds paints a different picture.
TL;DR: Congress is about as negative on China as the President is. And their followers are rewarding negative China tweets with more likes and retweets. With Congress providing tacit support, the public behind them, a stock market near all-time highs, and steady approval ratings, expect the President to remain tough on China.
Using the Twitter API and the Tweepy library, I scraped the most recent 3,200 tweets from every member of the 116th U.S. Congress. I selected tweets containing the words China, Hong Kong, President Xi, Xi Jinping, Beijing, or Chinese. From this subset, I filtered out any tweets containing Trump, Administration, or President, in an imperfect attempt to eliminate tweets directed at the President. I fed the resulting 4,099-tweet dataset through an enhanced version of my previously built China sentiment model. One sample of the code for scraping these tweets can be found here.
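The filtering step above can be sketched as a simple keyword test. This is a minimal illustration, not the post's actual code: the function name is hypothetical, and the real pipeline pulls each member's timeline via Tweepy before applying a filter like this. Note the filter's acknowledged imperfection: excluding "President" also drops tweets that matched on "President Xi."

```python
# Keyword lists taken from the methodology described in the post.
CHINA_TERMS = ["china", "hong kong", "president xi", "xi jinping", "beijing", "chinese"]
EXCLUDE_TERMS = ["trump", "administration", "president"]  # crude proxy for President-directed tweets

def is_china_tweet(text: str) -> bool:
    """Keep tweets that mention China, then drop ones that look aimed at the President."""
    t = text.lower()
    if not any(term in t for term in CHINA_TERMS):
        return False
    return not any(term in t for term in EXCLUDE_TERMS)

tweets = [
    "Tariffs on Chinese goods are hurting Iowa farmers.",
    "The President's trade deal with China is a win.",   # excluded: mentions "President"
    "Standing with the people of Hong Kong today.",
    "Voted on the farm bill this afternoon.",            # excluded: no China terms
]
china_tweets = [t for t in tweets if is_china_tweet(t)]
```

A dataset built this way keeps the first and third tweets and drops the other two.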
67% of China Tweets by Congress Had Negative Sentiment
This is in line with the President’s negative tweeting rate on China post-election.
Senators tweet about the topic three times as often as their House colleagues. Republican Senators lead the way at 25 tweets per member. But when it comes to negative sentiment:
Senate Democrats Led the Way with 78% Negative Tweets
Full code for this post can be found on my GitHub here.
Twitter Followers Seem to Agree
The Twitter following public is rewarding representatives’ tough talk on China and trade.
Across the board, all negative sentiment tweets receive more Likes.
The picture with retweets is more mixed, but the partisan divide is starker.
Higher Democratic like and retweet rates suggest other forces at work, such as passion around the 2020 election and concern over human rights in Hong Kong.
The words they use also veer toward negativity. A word cloud of the most common words in the original dataset surfaces many negative words:
I continue to use Transfer Learning using the ULMFIT implementation in fast.ai. I re-ran the Trump-China sentiment model using new optimization functions released this summer. For more detail on the core building blocks for an NLP model using ULMFIT and optimizer functions, read my other August posts here and here.
I combined two recently proposed optimizers, RAdam and LookAhead, using Ranger, a codebase developed by Less Wright.
Some background on LookAhead, from Less's post. Popular optimizers like Adam rely on adaptive momentum (speeding up learning with moving averages of the gradient and its recent direction) to improve on Stochastic Gradient Descent. LookAhead takes a completely different approach. As Wright states:
"LookAhead however, is a new development that maintains two sets of weights and then interpolates between them — in effect it allows a faster set of weights to 'look ahead' or explore while the slower weights stay behind to provide longer term stability." (Source)
Ranger has RAdam produce the fast set of weights, while LookAhead holds on to a slower copy. At a set interval, LookAhead takes the difference between the two sets of weights, multiplies it by a defined parameter (0.5), and moves the slow weights that far toward the fast ones.
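The update rule just described can be shown on a toy problem. This is an illustrative sketch, not Ranger's implementation: plain SGD stands in for RAdam as the fast inner optimizer, and the interval k=6 and interpolation factor alpha=0.5 are assumed to match Ranger's defaults.

```python
def lookahead_minimize(grad, w0, lr=0.1, k=6, alpha=0.5, steps=120):
    """Minimize a 1-D function with a LookAhead-style two-weight scheme."""
    slow = fast = w0
    for step in range(1, steps + 1):
        fast -= lr * grad(fast)            # fast weights explore (SGD standing in for RAdam)
        if step % k == 0:                  # every k steps...
            slow += alpha * (fast - slow)  # ...slow weights move halfway toward the fast ones
            fast = slow                    # fast weights restart from the updated slow weights
    return slow

# Minimize f(w) = (w - 3)^2, whose gradient is 2(w - 3); the minimum is at w = 3.
w = lookahead_minimize(lambda w: 2 * (w - 3), w0=0.0)
```

The slow weights only ever move a fraction of the distance the fast weights traveled, which is what provides the longer-term stability Wright describes.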
Implementing Ranger had immediate results, raising the accuracy of my Trump-China sentiment classification model from 85% to 91%!
Any opinions or forecasts contained herein reflect the personal and subjective judgments and assumptions of the author only. There can be no assurance that developments will transpire as forecasted, and actual results may differ. The accuracy of the data is not guaranteed; it represents the author's best judgment and is derived from a variety of sources. The information is subject to change at any time without notice.