It’s not uncommon to find strange patterns in NYC 311 data, and the data is not always what it seems. Sometimes the weather really was an outlier for a season, which affected the service requests that came in. Other times, a strange pattern emerges because a program publicized the ability to complain about a particular issue, inflating the counts for that complaint type. Without this additional context, the inflated numbers could be misleadingly used to support research claims.
For example, the De Blasio administration placed a large emphasis on reducing homelessness. Under the new administration, the 311 agency added a “homeless person assistance” request to the mobile app, which spurred a lot of usage -- this request was not previously available through NYC 311.
City workers helping homeless people began using the mobile app to track their activity, which caused a huge uptick in the data. Not long after, the trend leveled off. What you can't tell from looking at the data is that the city workers received a new application designed specifically for homelessness tracking. Once they had a purpose-built tool, they stopped using the 311 mobile app, and the data dropped back to a "normal" level of assistance requests.
Looking at the data alone, you’d think there was an unusual spike in homeless person assistance requests, after which the issue was resolved. Homelessness during this period actually remained nearly constant, but there’s no way to know that from the numbers.
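The spike-then-return shape described above is easy to surface programmatically. The sketch below flags months whose count far exceeds a trailing median baseline; the monthly figures are synthetic, invented to mimic the pattern in the text, and are not real NYC 311 counts.

```python
import statistics

# Sketch: flagging a spike-then-return pattern in monthly complaint counts.
# The numbers below are synthetic, NOT real NYC 311 figures.

def flag_spikes(counts, window=6, threshold=2.0):
    """Return indices of months whose count exceeds `threshold` times
    the median of the preceding `window` months."""
    spikes = []
    for i in range(window, len(counts)):
        baseline = statistics.median(counts[i - window:i])
        if counts[i] > threshold * baseline:
            spikes.append(i)
    return spikes

# Synthetic monthly "homeless person assistance" counts: a steady baseline,
# a spike when the app feature launches, then a return to baseline once
# workers switch to a dedicated tool.
monthly = [110, 95, 105, 100, 98, 102, 480, 510, 460, 120, 105, 99]

print(flag_spikes(monthly))  # → [6, 7, 8]
```

A median baseline is used rather than a mean so that the spike months themselves don't drag the baseline upward as the window slides past them. Of course, the check can only tell you *that* months 6–8 are anomalous; whether the cause is a real-world change or a business-process change still requires outside context.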
This is just one example of how 311 data may be inflated by business process changes, new administrations, or other political pushes. This isn’t to say the dataset can’t be relied on to add color to research questions. However, understanding its potential limitations can help position research questions for maximum impact.