Revamp post filtering system (#18058)

* Add model for custom filter keywords
* Use CustomFilterKeyword internally; does not change the API
* Fix /filters/edit and /filters/new
* Add migration tests
* Remove whole_word column from custom_filters (covered by custom_filter_keywords)
* Redesign /filters: instead of a list, present a card that displays more information and handles multiple keywords per filter
* Redesign /filters/new and /filters/edit to add and remove keywords. This adds a new gem dependency, cocoon, as well as an npm dependency, cocoon-js-vanilla. They are used to easily populate and remove form fields from the user interface when manipulating multiple keyword filters at once.
* Add /api/v2/filters to edit filters with multiple keywords
  Entities:
  - `Filter`: `id`, `title`, `filter_action` (either `hide` or `warn`), `context`, `keywords`
  - `FilterKeyword`: `id`, `keyword`, `whole_word`
  API endpoints:
  - `GET /api/v2/filters` to list filters (including keywords)
  - `POST /api/v2/filters` to create a new filter; `keywords_attributes` can also be passed to create keywords in one request
  - `GET /api/v2/filters/:id` to read a particular filter
  - `PUT /api/v2/filters/:id` to update an existing filter; `keywords_attributes` can also be passed to edit, delete or add keywords in one request
  - `DELETE /api/v2/filters/:id` to delete a particular filter
  - `GET /api/v2/filters/:id/keywords` to list keywords for a filter
  - `POST /api/v2/filters/:filter_id/keywords` to add a new keyword to a filter
  - `GET /api/v2/filter_keywords/:id` to read a particular keyword
  - `PUT /api/v2/filter_keywords/:id` to edit a particular keyword
  - `DELETE /api/v2/filter_keywords/:id` to delete a particular keyword
* Change from `irreversible` boolean to `action` enum
* Remove irrelevant `irreversible_must_be_within_context` check
* Fix /filters/new and /filters/edit to handle filter_action updates
* Fix Rubocop/Codeclimate complaints about task names
* Refactor FeedManager#phrase_filtered?: this moves regexp building and filter caching to the `CustomFilter` class. It does not change functional behavior yet, but it changes how the cache is built, using per-custom_filter regexps so that filters can be matched independently while still offering caching.
* Perform server-side filtering and output the result in the REST API
* Fix numerous filters_changed events being sent when editing multiple keywords at once
* Add some tests
* Use the new API in the web UI:
  - use client-side logic for filters we have fetched rules for, so that filter changes can be retroactively applied without reloading the UI
  - use server-side logic for filters we haven't fetched rules for yet (e.g. network error, or initial timeline loading)
* Minor optimizations and refactoring
* Perform server-side filtering on the streaming server
* Change the wording of filter action labels
* Fix issues pointed out by linter
* Change design of “Show anyway” link in accordance with review comments
* Drop “irreversible” filtering behavior
* Move /api/v2/filter_keywords to /api/v1/filters/keywords
* Rename `filter_results` attribute to `filtered`
* Rename REST::LegacyFilterSerializer to REST::V1::FilterSerializer
* Fix systemChannelId value in streaming server
* Simplify code by removing client-side filtering code. The simplification comes at a cost though: filters aren't retroactively applied anymore.
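For context, the sketch below shows how a client might exercise the new endpoints described above by creating a filter together with one keyword in a single request. It is illustrative only; `INSTANCE` and `TOKEN` are hypothetical placeholders, and a global `fetch` is assumed to be available.

// Illustrative sketch only — INSTANCE and TOKEN are placeholders, not values from this repository.
const INSTANCE = 'https://mastodon.example';
const TOKEN = 'replace-with-an-access-token';

const createFilterWithKeyword = async () => {
  const res = await fetch(`${INSTANCE}/api/v2/filters`, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${TOKEN}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      title: 'Spoilers',
      context: ['home', 'public'],
      filter_action: 'warn',
      // keywords_attributes lets keywords be created in the same request as the filter
      keywords_attributes: [
        { keyword: 'spoiler', whole_word: true },
      ],
    }),
  });

  return res.json(); // Filter entity, including its `keywords`
};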
// @ts-check

const os = require('os');
const throng = require('throng');
const dotenv = require('dotenv');
const express = require('express');
const http = require('http');
const redis = require('redis');
const pg = require('pg');
const dbUrlToConfig = require('pg-connection-string').parse;
const log = require('npmlog');
const url = require('url');
const uuid = require('uuid');
const fs = require('fs');
const WebSocket = require('ws');
const { JSDOM } = require('jsdom');

const env = process.env.NODE_ENV || 'development';

dotenv.config({
  path: env === 'production' ? '.env.production' : '.env',
});

log.level = process.env.LOG_LEVEL || 'verbose';
/**
 * @param {Object.<string, any>} defaultConfig
 * @param {string} redisUrl
 */
const redisUrlToClient = async (defaultConfig, redisUrl) => {
  const config = defaultConfig;

  let client;

  if (!redisUrl) {
    client = redis.createClient(config);
  } else if (redisUrl.startsWith('unix://')) {
    client = redis.createClient(Object.assign(config, {
      socket: {
        path: redisUrl.slice(7),
      },
    }));
  } else {
    client = redis.createClient(Object.assign(config, {
      url: redisUrl,
    }));
  }

  client.on('error', (err) => log.error('Redis Client Error!', err));
  await client.connect();

  return client;
};
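// Illustrative usage (the URLs below are placeholders, not actual configuration):
//   await redisUrlToClient(redisParams, undefined)                    // plain client built from defaultConfig
//   await redisUrlToClient(redisParams, 'unix:///var/run/redis.sock') // UNIX socket path taken after the `unix://` prefix
//   await redisUrlToClient(redisParams, 'redis://localhost:6379/0')   // full URL handed straight to node-redis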
const numWorkers = +process.env.STREAMING_CLUSTER_NUM || (env === 'development' ? 1 : Math.max(os.cpus().length - 1, 1));

/**
 * @param {string} json
 * @param {any} req
 * @return {Object.<string, any>|null}
 */
const parseJSON = (json, req) => {
  try {
    return JSON.parse(json);
  } catch (err) {
    if (req.accountId) {
      log.warn(req.requestId, `Error parsing message from user ${req.accountId}: ${err}`);
    } else {
      log.silly(req.requestId, `Error parsing message from ${req.remoteAddress}: ${err}`);
    }
    return null;
  }
};
const startMaster = () => {
  if (!process.env.SOCKET && process.env.PORT && isNaN(+process.env.PORT)) {
    log.warn('UNIX domain socket is now supported by using SOCKET. Please migrate from PORT hack.');
  }

  log.warn(`Starting streaming API server master with ${numWorkers} workers`);
};
/**
 * @return {Object.<string, any>}
 */
const pgConfigFromEnv = () => {
  const pgConfigs = {
    development: {
      user: process.env.DB_USER || pg.defaults.user,
      password: process.env.DB_PASS || pg.defaults.password,
      database: process.env.DB_NAME || 'mastodon_development',
      host: process.env.DB_HOST || pg.defaults.host,
      port: process.env.DB_PORT || pg.defaults.port,
    },

    production: {
      user: process.env.DB_USER || 'mastodon',
      password: process.env.DB_PASS || '',
      database: process.env.DB_NAME || 'mastodon_production',
      host: process.env.DB_HOST || 'localhost',
      port: process.env.DB_PORT || 5432,
    },
  };

  let baseConfig;

  if (process.env.DATABASE_URL) {
    baseConfig = dbUrlToConfig(process.env.DATABASE_URL);
  } else {
    baseConfig = pgConfigs[env];

    if (process.env.DB_SSLMODE) {
      switch(process.env.DB_SSLMODE) {
      case 'disable':
      case '':
        baseConfig.ssl = false;
        break;
      case 'no-verify':
        baseConfig.ssl = { rejectUnauthorized: false };
        break;
      default:
        baseConfig.ssl = {};
        break;
      }
    }
  }

  return {
    ...baseConfig,
    max: process.env.DB_POOL || 10,
    connectionTimeoutMillis: 15000,
    application_name: '',
  };
};
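// Illustrative resolution (environment values below are hypothetical):
//   DATABASE_URL=postgres://mastodon:secret@db.example/mastodon_production
//     -> baseConfig comes from pg-connection-string; the pool max still defaults to 10
//   DB_SSLMODE=no-verify (without DATABASE_URL)
//     -> baseConfig.ssl = { rejectUnauthorized: false }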
const startWorker = async (workerId) => {
  log.warn(`Starting worker ${workerId}`);

  const app = express();

  app.set('trust proxy', process.env.TRUSTED_PROXY_IP ? process.env.TRUSTED_PROXY_IP.split(/(?:\s*,\s*|\s+)/) : 'loopback,uniquelocal');

  const pgPool = new pg.Pool(pgConfigFromEnv());
  const server = http.createServer(app);
  const redisNamespace = process.env.REDIS_NAMESPACE || null;

  const redisParams = {
    socket: {
      host: process.env.REDIS_HOST || '127.0.0.1',
      port: process.env.REDIS_PORT || 6379,
    },
    database: process.env.REDIS_DB || 0,
    password: process.env.REDIS_PASSWORD || undefined,
  };

  if (redisNamespace) {
    redisParams.namespace = redisNamespace;
  }

  const redisPrefix = redisNamespace ? `${redisNamespace}:` : '';

  /**
   * @type {Object.<string, Array.<function(string): void>>}
   */
  const subs = {};

  const redisSubscribeClient = await redisUrlToClient(redisParams, process.env.REDIS_URL);
  const redisClient = await redisUrlToClient(redisParams, process.env.REDIS_URL);
  /**
   * @param {string[]} channels
   * @return {function(): void}
   */
  const subscriptionHeartbeat = channels => {
    const interval = 6 * 60;

    const tellSubscribed = () => {
      channels.forEach(channel => redisClient.set(`${redisPrefix}subscribed:${channel}`, '1', 'EX', interval * 3));
    };

    tellSubscribed();

    const heartbeat = setInterval(tellSubscribed, interval * 1000);

    return () => {
      clearInterval(heartbeat);
    };
  };
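  // The `subscribed:<channel>` keys are refreshed every `interval` seconds and expire after
  // three intervals, so they disappear on their own once a worker stops renewing them
  // (presumably letting the publishing side skip channels that no longer have listeners).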
  /**
   * @param {string} message
   * @param {string} channel
   */
  const onRedisMessage = (message, channel) => {
    const callbacks = subs[channel];

    log.silly(`New message on channel ${channel}`);

    if (!callbacks) {
      return;
    }

    callbacks.forEach(callback => callback(message));
  };

  /**
   * @param {string} channel
   * @param {function(string): void} callback
   */
  const subscribe = (channel, callback) => {
    log.silly(`Adding listener for ${channel}`);

    subs[channel] = subs[channel] || [];

    if (subs[channel].length === 0) {
      log.verbose(`Subscribe ${channel}`);
      redisSubscribeClient.subscribe(channel, onRedisMessage);
    }

    subs[channel].push(callback);
  };

  /**
   * @param {string} channel
   * @param {function(string): void} callback
   */
  const unsubscribe = (channel, callback) => {
    log.silly(`Removing listener for ${channel}`);

    if (!subs[channel]) {
      return;
    }

    subs[channel] = subs[channel].filter(item => item !== callback);

    if (subs[channel].length === 0) {
      log.verbose(`Unsubscribe ${channel}`);
      redisSubscribeClient.unsubscribe(channel);
      delete subs[channel];
    }
  };
  const FALSE_VALUES = [
    false,
    0,
    '0',
    'f',
    'F',
    'false',
    'FALSE',
    'off',
    'OFF',
  ];

  /**
   * @param {any} value
   * @return {boolean}
   */
  const isTruthy = value =>
    value && !FALSE_VALUES.includes(value);
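  // Examples: isTruthy('true') and isTruthy('1') are truthy; isTruthy('false'),
  // isTruthy('off'), isTruthy('0') and isTruthy(undefined) are falsy.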
  /**
   * @param {any} req
   * @param {any} res
   * @param {function(Error=): void} next
   */
  const allowCrossDomain = (req, res, next) => {
    res.header('Access-Control-Allow-Origin', '*');
    res.header('Access-Control-Allow-Headers', 'Authorization, Accept, Cache-Control');
    res.header('Access-Control-Allow-Methods', 'GET, OPTIONS');

    next();
  };

  /**
   * @param {any} req
   * @param {any} res
   * @param {function(Error=): void} next
   */
  const setRequestId = (req, res, next) => {
    req.requestId = uuid.v4();
    res.header('X-Request-Id', req.requestId);

    next();
  };

  /**
   * @param {any} req
   * @param {any} res
   * @param {function(Error=): void} next
   */
  const setRemoteAddress = (req, res, next) => {
    req.remoteAddress = req.connection.remoteAddress;

    next();
  };
  /**
   * @param {any} req
   * @param {string[]} necessaryScopes
   * @return {boolean}
   */
  const isInScope = (req, necessaryScopes) =>
    req.scopes.some(scope => necessaryScopes.includes(scope));

  /**
   * @param {string} token
   * @param {any} req
   * @return {Promise.<void>}
   */
  const accountFromToken = (token, req) => new Promise((resolve, reject) => {
    pgPool.connect((err, client, done) => {
      if (err) {
        reject(err);
        return;
      }

      client.query('SELECT oauth_access_tokens.id, oauth_access_tokens.resource_owner_id, users.account_id, users.chosen_languages, oauth_access_tokens.scopes, devices.device_id FROM oauth_access_tokens INNER JOIN users ON oauth_access_tokens.resource_owner_id = users.id LEFT OUTER JOIN devices ON oauth_access_tokens.id = devices.access_token_id WHERE oauth_access_tokens.token = $1 AND oauth_access_tokens.revoked_at IS NULL LIMIT 1', [token], (err, result) => {
        done();

        if (err) {
          reject(err);
          return;
        }

        if (result.rows.length === 0) {
          err = new Error('Invalid access token');
          err.status = 401;

          reject(err);
          return;
        }

        req.accessTokenId = result.rows[0].id;
        req.scopes = result.rows[0].scopes.split(' ');
        req.accountId = result.rows[0].account_id;
        req.chosenLanguages = result.rows[0].chosen_languages;
        req.deviceId = result.rows[0].device_id;

        resolve();
      });
    });
  });
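  // On success the request object is annotated with accessTokenId, scopes, accountId,
  // chosenLanguages and deviceId, which the scope checks and stream handlers below rely on.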
  /**
   * @param {any} req
   * @return {Promise.<void>}
   */
  const accountFromRequest = (req) => new Promise((resolve, reject) => {
    const authorization = req.headers.authorization;
    const location = url.parse(req.url, true);
    const accessToken = location.query.access_token || req.headers['sec-websocket-protocol'];

    if (!authorization && !accessToken) {
      const err = new Error('Missing access token');
      err.status = 401;

      reject(err);
      return;
    }

    const token = authorization ? authorization.replace(/^Bearer /, '') : accessToken;

    resolve(accountFromToken(token, req));
  });
  /**
   * @param {any} req
   * @return {string}
   */
  const channelNameFromPath = req => {
    const { path, query } = req;
    const onlyMedia = isTruthy(query.only_media);

    switch (path) {
    case '/api/v1/streaming/user':
      return 'user';
    case '/api/v1/streaming/user/notification':
      return 'user:notification';
    case '/api/v1/streaming/public':
      return onlyMedia ? 'public:media' : 'public';
    case '/api/v1/streaming/public/local':
      return onlyMedia ? 'public:local:media' : 'public:local';
    case '/api/v1/streaming/public/remote':
      return onlyMedia ? 'public:remote:media' : 'public:remote';
    case '/api/v1/streaming/hashtag':
      return 'hashtag';
    case '/api/v1/streaming/hashtag/local':
      return 'hashtag:local';
    case '/api/v1/streaming/direct':
      return 'direct';
    case '/api/v1/streaming/list':
      return 'list';
    default:
      return undefined;
    }
  };
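  // Example: GET /api/v1/streaming/public/local?only_media=true resolves to the
  // 'public:local:media' channel name; unknown paths resolve to undefined.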
  const PUBLIC_CHANNELS = [
    'public',
    'public:media',
    'public:local',
    'public:local:media',
    'public:remote',
    'public:remote:media',
    'hashtag',
    'hashtag:local',
  ];
  /**
   * @param {any} req
   * @param {string} channelName
   * @return {Promise.<void>}
   */
  const checkScopes = (req, channelName) => new Promise((resolve, reject) => {
    log.silly(req.requestId, `Checking OAuth scopes for ${channelName}`);

    // When accessing public channels, no scopes are needed
    if (PUBLIC_CHANNELS.includes(channelName)) {
      resolve();
      return;
    }

    // The `read` scope has the highest priority, if the token has it
    // then it can access all streams
    const requiredScopes = ['read'];

    // When accessing specifically the notifications stream,
    // we need a read:notifications, while in all other cases,
    // we can allow access with read:statuses. Mind that the
    // user stream will not contain notifications unless
    // the token has either read or read:notifications scope
    // as well, this is handled separately.
    if (channelName === 'user:notification') {
      requiredScopes.push('read:notifications');
    } else {
      requiredScopes.push('read:statuses');
    }

    if (req.scopes && requiredScopes.some(requiredScope => req.scopes.includes(requiredScope))) {
      resolve();
      return;
    }

    const err = new Error('Access token does not cover required scopes');
    err.status = 401;

    reject(err);
  });
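  // Example: a token scoped to ['read:statuses'] can subscribe to 'user' and 'list'
  // streams, but is rejected for 'user:notification', which needs 'read' or 'read:notifications'.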
  /**
   * @param {any} info
   * @param {function(boolean, number, string): void} callback
   */
  const wsVerifyClient = (info, callback) => {
    // When verifying the websockets connection, we no longer pre-emptively
    // check OAuth scopes and drop the connection if they're missing. We only
    // drop the connection if access without token is not allowed by environment
    // variables. OAuth scope checks are moved to the point of subscription
    // to a specific stream.
    accountFromRequest(info.req).then(() => {
      callback(true, undefined, undefined);
    }).catch(err => {
      log.error(info.req.requestId, err.toString());
      callback(false, 401, 'Unauthorized');
    });
  };
  /**
   * @typedef SystemMessageHandlers
   * @property {function(): void} onKill
   */

  /**
   * @param {any} req
   * @param {SystemMessageHandlers} eventHandlers
   * @return {function(string): void}
   */
  const createSystemMessageListener = (req, eventHandlers) => {
    return message => {
      const json = parseJSON(message, req);

      if (!json) return;

      const { event } = json;

      log.silly(req.requestId, `System message for ${req.accountId}: ${event}`);

      if (event === 'kill') {
        log.verbose(req.requestId, `Closing connection for ${req.accountId} due to expired access token`);
        eventHandlers.onKill();
      } else if (event === 'filters_changed') {
        log.verbose(req.requestId, `Invalidating filters cache for ${req.accountId}`);
        req.cachedFilters = null;
      }
    };
  };
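  // The listener reacts to JSON payloads published on the system channels, e.g.
  // '{"event":"kill"}' closes the connection and '{"event":"filters_changed"}' drops the
  // per-connection filter cache so it is rebuilt on the next filtered status.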
  /**
   * @param {any} req
   * @param {any} res
   */
  const subscribeHttpToSystemChannel = (req, res) => {
    const accessTokenChannelId = `timeline:access_token:${req.accessTokenId}`;
    const systemChannelId = `timeline:system:${req.accountId}`;

    const listener = createSystemMessageListener(req, {
      onKill() {
        res.end();
      },
    });

    res.on('close', () => {
      unsubscribe(`${redisPrefix}${accessTokenChannelId}`, listener);
      unsubscribe(`${redisPrefix}${systemChannelId}`, listener);
    });

    subscribe(`${redisPrefix}${accessTokenChannelId}`, listener);
    subscribe(`${redisPrefix}${systemChannelId}`, listener);
  };
  /**
   * @param {any} req
   * @param {any} res
   * @param {function(Error=): void} next
   */
  const authenticationMiddleware = (req, res, next) => {
    if (req.method === 'OPTIONS') {
      next();
      return;
    }

    accountFromRequest(req).then(() => checkScopes(req, channelNameFromPath(req))).then(() => {
      subscribeHttpToSystemChannel(req, res);
    }).then(() => {
      next();
    }).catch(err => {
      next(err);
    });
  };
  /**
   * @param {Error} err
   * @param {any} req
   * @param {any} res
   * @param {function(Error=): void} next
   */
  const errorMiddleware = (err, req, res, next) => {
    log.error(req.requestId, err.toString());

    if (res.headersSent) {
      next(err);
      return;
    }

    res.writeHead(err.status || 500, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ error: err.status ? err.toString() : 'An unexpected error occurred' }));
  };
  /**
   * @param {array} arr
   * @param {number=} shift
   * @return {string}
   */
  const placeholders = (arr, shift = 0) => arr.map((_, i) => `$${i + 1 + shift}`).join(', ');
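  // Example: placeholders(['a', 'b', 'c'], 2) yields '$3, $4, $5', used below to splice a
  // variable number of account ids into prepared SQL statements.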
  /**
   * @param {string} listId
   * @param {any} req
   * @return {Promise.<void>}
   */
  const authorizeListAccess = (listId, req) => new Promise((resolve, reject) => {
    const { accountId } = req;

    pgPool.connect((err, client, done) => {
      if (err) {
        reject();
        return;
      }

      client.query('SELECT id, account_id FROM lists WHERE id = $1 LIMIT 1', [listId], (err, result) => {
        done();

        if (err || result.rows.length === 0 || result.rows[0].account_id !== accountId) {
          reject();
          return;
        }

        resolve();
      });
    });
  });
  /**
   * @param {string[]} ids
   * @param {any} req
   * @param {function(string, string): void} output
   * @param {function(string[], function(string): void): void} attachCloseHandler
   * @param {boolean=} needsFiltering
   * @return {function(string): void}
   */
  const streamFrom = (ids, req, output, attachCloseHandler, needsFiltering = false) => {
    const accountId = req.accountId || req.remoteAddress;

    log.verbose(req.requestId, `Starting stream from ${ids.join(', ')} for ${accountId}`);

    const listener = message => {
      const json = parseJSON(message, req);

      if (!json) return;

      const { event, payload, queued_at } = json;

      const transmit = () => {
        const now = new Date().getTime();
        const delta = now - queued_at;
        const encodedPayload = typeof payload === 'object' ? JSON.stringify(payload) : payload;

        log.silly(req.requestId, `Transmitting for ${accountId}: ${event} ${encodedPayload} Delay: ${delta}ms`);
        output(event, encodedPayload);
      };

      // Only messages that may require filtering are statuses, since notifications
      // are already personalized and deletes do not matter
      if (!needsFiltering || event !== 'update') {
        transmit();
        return;
      }

      const unpackedPayload = payload;
      const targetAccountIds = [unpackedPayload.account.id].concat(unpackedPayload.mentions.map(item => item.id));
      const accountDomain = unpackedPayload.account.acct.split('@')[1];

      if (Array.isArray(req.chosenLanguages) && unpackedPayload.language !== null && req.chosenLanguages.indexOf(unpackedPayload.language) === -1) {
        log.silly(req.requestId, `Message ${unpackedPayload.id} filtered by language (${unpackedPayload.language})`);
        return;
      }

      // When the account is not logged in, it is not necessary to confirm the block or mute
      if (!req.accountId) {
        transmit();
        return;
      }

      pgPool.connect((err, client, done) => {
        if (err) {
          log.error(err);
          return;
        }

        const queries = [
          client.query(`SELECT 1
                        FROM blocks
                        WHERE (account_id = $1 AND target_account_id IN (${placeholders(targetAccountIds, 2)}))
                          OR (account_id = $2 AND target_account_id = $1)
                        UNION
                        SELECT 1
                        FROM mutes
                        WHERE account_id = $1
                          AND target_account_id IN (${placeholders(targetAccountIds, 2)})`, [req.accountId, unpackedPayload.account.id].concat(targetAccountIds)),
        ];

        if (accountDomain) {
          queries.push(client.query('SELECT 1 FROM account_domain_blocks WHERE account_id = $1 AND domain = $2', [req.accountId, accountDomain]));
        }

        if (!unpackedPayload.filtered && !req.cachedFilters) {
          queries.push(client.query('SELECT filter.id AS id, filter.phrase AS title, filter.context AS context, filter.expires_at AS expires_at, filter.action AS filter_action, keyword.keyword AS keyword, keyword.whole_word AS whole_word FROM custom_filter_keywords keyword JOIN custom_filters filter ON keyword.custom_filter_id = filter.id WHERE filter.account_id = $1 AND (filter.expires_at IS NULL OR filter.expires_at > NOW())', [req.accountId]));
        }

        Promise.all(queries).then(values => {
          done();

          if (values[0].rows.length > 0 || (accountDomain && values[1].rows.length > 0)) {
            return;
          }

          if (!unpackedPayload.filtered && !req.cachedFilters) {
            const filterRows = values[accountDomain ? 2 : 1].rows;

            req.cachedFilters = filterRows.reduce((cache, row) => {
              if (cache[row.id]) {
                cache[row.id].keywords.push([row.keyword, row.whole_word]);
              } else {
                cache[row.id] = {
                  keywords: [[row.keyword, row.whole_word]],
                  expires_at: row.expires_at,
                  repr: {
                    id: row.id,
                    title: row.title,
                    context: row.context,
                    expires_at: row.expires_at,
                    filter_action: ['warn', 'hide'][row.filter_action],
                  },
                };
              }

              return cache;
            }, {});

            Object.keys(req.cachedFilters).forEach((key) => {
              req.cachedFilters[key].regexp = new RegExp(req.cachedFilters[key].keywords.map(([keyword, whole_word]) => {
                let expr = keyword.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');

                if (whole_word) {
                  if (/^[\w]/.test(expr)) {
                    expr = `\\b${expr}`;
                  }

                  if (/[\w]$/.test(expr)) {
                    expr = `${expr}\\b`;
                  }
                }

                return expr;
              }).join('|'), 'i');
            });
          }

          // Check filters
          if (req.cachedFilters && !unpackedPayload.filtered) {
            const status = unpackedPayload;
            const searchContent = ([status.spoiler_text || '', status.content].concat((status.poll && status.poll.options) ? status.poll.options.map(option => option.title) : [])).concat(status.media_attachments.map(att => att.description)).join('\n\n').replace(/<br\s*\/?>/g, '\n').replace(/<\/p><p>/g, '\n\n');
            const searchIndex = JSDOM.fragment(searchContent).textContent;
            const now = new Date();

            payload.filtered = [];

            Object.values(req.cachedFilters).forEach((cachedFilter) => {
              if ((cachedFilter.expires_at === null || cachedFilter.expires_at > now)) {
                const keyword_matches = searchIndex.match(cachedFilter.regexp);

                if (keyword_matches) {
                  payload.filtered.push({
                    filter: cachedFilter.repr,
                    keyword_matches,
                  });
                }
              }
            });
          }

          transmit();
        }).catch(err => {
          log.error(err);
          done();
        });
      });
    };

    ids.forEach(id => {
      subscribe(`${redisPrefix}${id}`, listener);
    });

    if (attachCloseHandler) {
      attachCloseHandler(ids.map(id => `${redisPrefix}${id}`), listener);
    }

    return listener;
  };
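  // Each entry appended to payload.filtered has the shape
  // { filter: { id, title, context, expires_at, filter_action }, keyword_matches: [...] },
  // matching the cached `repr` objects built above.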
  /**
   * @param {any} req
   * @param {any} res
   * @return {function(string, string): void}
   */
  const streamToHttp = (req, res) => {
    const accountId = req.accountId || req.remoteAddress;

    res.setHeader('Content-Type', 'text/event-stream');
    res.setHeader('Cache-Control', 'no-store');
    res.setHeader('Transfer-Encoding', 'chunked');

    res.write(':)\n');

    const heartbeat = setInterval(() => res.write(':thump\n'), 15000);

    req.on('close', () => {
      log.verbose(req.requestId, `Ending stream for ${accountId}`);
      clearInterval(heartbeat);
    });

    return (event, payload) => {
      res.write(`event: ${event}\n`);
      res.write(`data: ${payload}\n\n`);
    };
  };
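  // On the wire this produces standard server-sent events:
  //   event: update
  //   data: <JSON-encoded payload>
  // with a ':thump' comment line written every 15 seconds as a keep-alive.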
  /**
   * @param {any} req
   * @param {function(): void} [closeHandler]
   * @return {function(string[], function(string): void): void}
   */
  const streamHttpEnd = (req, closeHandler = undefined) => (ids, listener) => {
    req.on('close', () => {
      ids.forEach(id => {
        // Pass the listener so `unsubscribe` can actually remove it from the channel's
        // callback list (it filters by callback identity).
        unsubscribe(id, listener);
      });

      if (closeHandler) {
        closeHandler();
      }
    });
  };
  /**
   * @param {any} req
   * @param {any} ws
   * @param {string[]} streamName
   * @return {function(string, string): void}
   */
  const streamToWs = (req, ws, streamName) => (event, payload) => {
    if (ws.readyState !== ws.OPEN) {
      log.error(req.requestId, 'Tried writing to closed socket');
      return;
    }

    ws.send(JSON.stringify({ stream: streamName, event, payload }));
  };

  /**
   * @param {any} res
   */
  const httpNotFound = res => {
    res.writeHead(404, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ error: 'Not found' }));
  };
  app.use(setRequestId);
  app.use(setRemoteAddress);
  app.use(allowCrossDomain);

  app.get('/api/v1/streaming/health', (req, res) => {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('OK');
  });

  app.get('/metrics', (req, res) => server.getConnections((err, count) => {
    res.writeHeader(200, { 'Content-Type': 'application/openmetrics-text; version=1.0.0; charset=utf-8' });
    res.write('# TYPE connected_clients gauge\n');
    res.write('# HELP connected_clients The number of clients connected to the streaming server\n');
    res.write(`connected_clients ${count}.0\n`);
    res.write('# TYPE connected_channels gauge\n');
    res.write('# HELP connected_channels The number of Redis channels the streaming server is subscribed to\n');
    res.write(`connected_channels ${Object.keys(subs).length}.0\n`);
    res.write('# TYPE pg_pool_total_connections gauge\n');
    res.write('# HELP pg_pool_total_connections The total number of clients existing within the pool\n');
    res.write(`pg_pool_total_connections ${pgPool.totalCount}.0\n`);
    res.write('# TYPE pg_pool_idle_connections gauge\n');
    res.write('# HELP pg_pool_idle_connections The number of clients which are not checked out but are currently idle in the pool\n');
    res.write(`pg_pool_idle_connections ${pgPool.idleCount}.0\n`);
    res.write('# TYPE pg_pool_waiting_queries gauge\n');
    res.write('# HELP pg_pool_waiting_queries The number of queued requests waiting on a client when all clients are checked out\n');
    res.write(`pg_pool_waiting_queries ${pgPool.waitingCount}.0\n`);
    res.write('# EOF\n');
    res.end();
  }));

  app.use(authenticationMiddleware);
  app.use(errorMiddleware);
  app.get('/api/v1/streaming/*', (req, res) => {
    channelNameToIds(req, channelNameFromPath(req), req.query).then(({ channelIds, options }) => {
      const onSend = streamToHttp(req, res);
      const onEnd = streamHttpEnd(req, subscriptionHeartbeat(channelIds));

      streamFrom(channelIds, req, onSend, onEnd, options.needsFiltering);
    }).catch(err => {
      log.verbose(req.requestId, 'Subscription error:', err.toString());
      httpNotFound(res);
    });
  });

  const wss = new WebSocket.Server({ server, verifyClient: wsVerifyClient });
  /**
   * @typedef StreamParams
   * @property {string} [tag]
   * @property {string} [list]
   * @property {string} [only_media]
   */

  /**
   * @param {any} req
   * @return {string[]}
   */
  const channelsForUserStream = req => {
    const arr = [`timeline:${req.accountId}`];

    if (isInScope(req, ['crypto']) && req.deviceId) {
      arr.push(`timeline:${req.accountId}:${req.deviceId}`);
    }

    if (isInScope(req, ['read', 'read:notifications'])) {
      arr.push(`timeline:${req.accountId}:notifications`);
    }

    return arr;
  };
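  // Example (hypothetical account id 123): a token carrying the 'read' and 'crypto' scopes
  // with a registered device ends up subscribed to 'timeline:123',
  // 'timeline:123:<deviceId>' and 'timeline:123:notifications'.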
  /**
   * See app/lib/ascii_folder.rb for the canon definitions
   * of these constants
   */
  const NON_ASCII_CHARS = 'ÀÁÂÃÄÅàáâãäåĀāĂ㥹ÇçĆćĈĉĊċČčÐðĎďĐđÈÉÊËèéêëĒēĔĕĖėĘęĚěĜĝĞğĠġĢģĤĥĦħÌÍÎÏìíîïĨĩĪīĬĭĮįİıĴĵĶķĸĹĺĻļĽľĿŀŁłÑñŃńŅņŇňʼnŊŋÒÓÔÕÖØòóôõöøŌōŎŏŐőŔŕŖŗŘřŚśŜŝŞşŠšſŢţŤťŦŧÙÚÛÜùúûüŨũŪūŬŭŮůŰűŲųŴŵÝýÿŶŷŸŹźŻżŽž';
  const EQUIVALENT_ASCII_CHARS = 'AAAAAAaaaaaaAaAaAaCcCcCcCcCcDdDdDdEEEEeeeeEeEeEeEeEeGgGgGgGgHhHhIIIIiiiiIiIiIiIiIiJjKkkLlLlLlLlLlNnNnNnNnnNnOOOOOOooooooOoOoOoRrRrRrSsSsSsSssTtTtTtUUUUuuuuUuUuUuUuUuUuWwYyyYyYZzZzZz';
  /**
   * @param {string} str
   * @return {string}
   */
  const foldToASCII = str => {
    const regex = new RegExp(NON_ASCII_CHARS.split('').join('|'), 'g');

    return str.replace(regex, match => {
      const index = NON_ASCII_CHARS.indexOf(match);
      return EQUIVALENT_ASCII_CHARS[index];
    });
  };

  /**
   * @param {string} str
   * @return {string}
   */
  const normalizeHashtag = str => {
    return foldToASCII(str.normalize('NFKC').toLowerCase()).replace(/[^\p{L}\p{N}_\u00b7\u200c]/gu, '');
  };
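  // Example: normalizeHashtag('Dogs!') === 'dogs'; accented characters are folded to their
  // ASCII equivalents first, so variant spellings of a tag map to the same Redis channel.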
  /**
   * @param {any} req
   * @param {string} name
   * @param {StreamParams} params
   * @return {Promise.<{ channelIds: string[], options: { needsFiltering: boolean } }>}
   */
  const channelNameToIds = (req, name, params) => new Promise((resolve, reject) => {
    switch (name) {
    case 'user':
      resolve({
        channelIds: channelsForUserStream(req),
        options: { needsFiltering: false },
      });

      break;
    case 'user:notification':
      resolve({
        channelIds: [`timeline:${req.accountId}:notifications`],
        options: { needsFiltering: false },
      });

      break;
    case 'public':
      resolve({
        channelIds: ['timeline:public'],
        options: { needsFiltering: true },
      });

      break;
    case 'public:local':
      resolve({
        channelIds: ['timeline:public:local'],
        options: { needsFiltering: true },
      });

      break;
    case 'public:remote':
      resolve({
        channelIds: ['timeline:public:remote'],
        options: { needsFiltering: true },
      });

      break;
    case 'public:media':
      resolve({
        channelIds: ['timeline:public:media'],
        options: { needsFiltering: true },
      });

      break;
    case 'public:local:media':
      resolve({
        channelIds: ['timeline:public:local:media'],
        options: { needsFiltering: true },
      });

      break;
    case 'public:remote:media':
      resolve({
        channelIds: ['timeline:public:remote:media'],
        options: { needsFiltering: true },
      });

      break;
    case 'direct':
      resolve({
        channelIds: [`timeline:direct:${req.accountId}`],
        options: { needsFiltering: false },
      });

      break;
    case 'hashtag':
      if (!params.tag || params.tag.length === 0) {
        reject('No tag for stream provided');
      } else {
        resolve({
          channelIds: [`timeline:hashtag:${normalizeHashtag(params.tag)}`],
          options: { needsFiltering: true },
        });
      }

      break;
    case 'hashtag:local':
      if (!params.tag || params.tag.length === 0) {
        reject('No tag for stream provided');
      } else {
        resolve({
          channelIds: [`timeline:hashtag:${normalizeHashtag(params.tag)}:local`],
          options: { needsFiltering: true },
        });
      }

      break;
    case 'list':
      authorizeListAccess(params.list, req).then(() => {
        resolve({
          channelIds: [`timeline:list:${params.list}`],
          options: { needsFiltering: false },
        });
      }).catch(() => {
        reject('Not authorized to stream this list');
      });

      break;
    default:
      reject('Unknown stream type');
    }
  });
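  // Example: ('hashtag', { tag: 'Dogs' }) resolves to
  // { channelIds: ['timeline:hashtag:dogs'], options: { needsFiltering: true } },
  // while personal streams such as 'user' and 'direct' skip server-side filtering.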
  /**
   * @param {string} channelName
   * @param {StreamParams} params
   * @return {string[]}
   */
  const streamNameFromChannelName = (channelName, params) => {
    if (channelName === 'list') {
      return [channelName, params.list];
    } else if (['hashtag', 'hashtag:local'].includes(channelName)) {
      return [channelName, params.tag];
    } else {
      return [channelName];
    }
  };
  /**
   * @typedef WebSocketSession
   * @property {any} socket
   * @property {any} request
   * @property {Object.<string, { listener: function(string): void, stopHeartbeat: function(): void }>} subscriptions
   */

  /**
   * @param {WebSocketSession} session
   * @param {string} channelName
   * @param {StreamParams} params
   */
  const subscribeWebsocketToChannel = ({ socket, request, subscriptions }, channelName, params) =>
    checkScopes(request, channelName).then(() => channelNameToIds(request, channelName, params)).then(({
      channelIds,
      options,
    }) => {
      if (subscriptions[channelIds.join(';')]) {
        return;
      }

      const onSend = streamToWs(request, socket, streamNameFromChannelName(channelName, params));
      const stopHeartbeat = subscriptionHeartbeat(channelIds);
      const listener = streamFrom(channelIds, request, onSend, undefined, options.needsFiltering);

      subscriptions[channelIds.join(';')] = {
        listener,
        stopHeartbeat,
      };
    }).catch(err => {
      log.verbose(request.requestId, 'Subscription error:', err.toString());
      socket.send(JSON.stringify({ error: err.toString() }));
    });
  /**
   * @param {WebSocketSession} session
   * @param {string} channelName
   * @param {StreamParams} params
   */
  const unsubscribeWebsocketFromChannel = ({ socket, request, subscriptions }, channelName, params) =>
    channelNameToIds(request, channelName, params).then(({ channelIds }) => {
      log.verbose(request.requestId, `Ending stream from ${channelIds.join(', ')} for ${request.accountId}`);

      const subscription = subscriptions[channelIds.join(';')];

      if (!subscription) {
        return;
      }

      const { listener, stopHeartbeat } = subscription;

      channelIds.forEach(channelId => {
        unsubscribe(`${redisPrefix}${channelId}`, listener);
      });

      stopHeartbeat();

      delete subscriptions[channelIds.join(';')];
    }).catch(err => {
      log.verbose(request.requestId, 'Unsubscription error:', err);
      socket.send(JSON.stringify({ error: err.toString() }));
    });
  /**
   * @param {WebSocketSession} session
   */
  const subscribeWebsocketToSystemChannel = ({ socket, request, subscriptions }) => {
    const accessTokenChannelId = `timeline:access_token:${request.accessTokenId}`;
    const systemChannelId = `timeline:system:${request.accountId}`;

    const listener = createSystemMessageListener(request, {
      onKill() {
        socket.close();
      },
    });

    subscribe(`${redisPrefix}${accessTokenChannelId}`, listener);
    subscribe(`${redisPrefix}${systemChannelId}`, listener);

    subscriptions[accessTokenChannelId] = {
      listener,
      stopHeartbeat: () => {},
    };

    subscriptions[systemChannelId] = {
      listener,
      stopHeartbeat: () => {},
    };
  };
  /**
   * @param {string|string[]} arrayOrString
   * @return {string}
   */
  const firstParam = arrayOrString => {
    if (Array.isArray(arrayOrString)) {
      return arrayOrString[0];
    } else {
      return arrayOrString;
    }
  };
  wss.on('connection', (ws, req) => {
    const location = url.parse(req.url, true);

    req.requestId = uuid.v4();
    req.remoteAddress = ws._socket.remoteAddress;

    ws.isAlive = true;

    ws.on('pong', () => {
      ws.isAlive = true;
    });

    /**
     * @type {WebSocketSession}
     */
    const session = {
      socket: ws,
      request: req,
      subscriptions: {},
    };

    const onEnd = () => {
      const keys = Object.keys(session.subscriptions);

      keys.forEach(channelIds => {
        const { listener, stopHeartbeat } = session.subscriptions[channelIds];

        channelIds.split(';').forEach(channelId => {
          unsubscribe(`${redisPrefix}${channelId}`, listener);
        });

        stopHeartbeat();
      });
    };

    ws.on('close', onEnd);
    ws.on('error', onEnd);

    ws.on('message', data => {
      const json = parseJSON(data, session.request);

      if (!json) return;

      const { type, stream, ...params } = json;

      if (type === 'subscribe') {
        subscribeWebsocketToChannel(session, firstParam(stream), params);
      } else if (type === 'unsubscribe') {
        unsubscribeWebsocketFromChannel(session, firstParam(stream), params);
      } else {
        // Unknown action type
      }
    });

    subscribeWebsocketToSystemChannel(session);

    if (location.query.stream) {
      subscribeWebsocketToChannel(session, firstParam(location.query.stream), location.query);
    }
  });
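  // Clients drive subscriptions with JSON control frames, e.g.
  //   {"type":"subscribe","stream":"hashtag","tag":"dogs"}
  //   {"type":"unsubscribe","stream":"public:local"}
  // or by passing ?stream=... on the initial connection URL.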
  setInterval(() => {
    wss.clients.forEach(ws => {
      if (ws.isAlive === false) {
        ws.terminate();
        return;
      }

      ws.isAlive = false;
      ws.ping('', false);
    });
  }, 30000);
  attachServerWithConfig(server, address => {
    log.warn(`Worker ${workerId} now listening on ${address}`);
  });

  const onExit = () => {
    log.warn(`Worker ${workerId} exiting`);
    server.close();
    process.exit(0);
  };

  const onError = (err) => {
    log.error(err);
    server.close();
    process.exit(0);
  };

  process.on('SIGINT', onExit);
  process.on('SIGTERM', onExit);
  process.on('exit', onExit);
  process.on('uncaughtException', onError);
};
/**
 * @param {any} server
 * @param {function(string): void} [onSuccess]
 */
const attachServerWithConfig = (server, onSuccess) => {
  if (process.env.SOCKET || process.env.PORT && isNaN(+process.env.PORT)) {
    server.listen(process.env.SOCKET || process.env.PORT, () => {
      if (onSuccess) {
        fs.chmodSync(server.address(), 0o666);
        onSuccess(server.address());
      }
    });
  } else {
    server.listen(+process.env.PORT || 4000, process.env.BIND || '127.0.0.1', () => {
      if (onSuccess) {
        onSuccess(`${server.address().address}:${server.address().port}`);
      }
    });
  }
};
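// Example configurations (values are illustrative):
//   SOCKET=/var/run/mastodon/streaming.sock  -> listen on a UNIX socket, chmod it to 0666
//   PORT=4000 BIND=0.0.0.0                   -> listen on TCP; defaults are 4000 and 127.0.0.1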
/**
 * @param {function(Error=): void} onSuccess
 */
const onPortAvailable = onSuccess => {
  const testServer = http.createServer();

  testServer.once('error', err => {
    onSuccess(err);
  });

  testServer.once('listening', () => {
    testServer.once('close', () => onSuccess());
    testServer.close();
  });

  attachServerWithConfig(testServer);
};
onPortAvailable(err => {
  if (err) {
    log.error('Could not start server, the port or socket is in use');
    return;
  }

  throng({
    workers: numWorkers,
    lifetime: Infinity,
    start: startWorker,
    master: startMaster,
  });
});