Revamp post filtering system (#18058)

* Add model for custom filter keywords
* Use CustomFilterKeyword internally. Does not change the API.
* Fix /filters/edit and /filters/new
* Add migration tests
* Remove whole_word column from custom_filters (covered by custom_filter_keywords)
* Redesign /filters: instead of a list, present a card that displays more information and handles multiple keywords per filter
* Redesign /filters/new and /filters/edit to add and remove keywords. This adds a new gem dependency, cocoon, as well as an npm dependency, cocoon-js-vanilla. These are used to easily populate and remove form fields from the user interface when manipulating multiple keyword filters at once.
* Add /api/v2/filters to edit filters with multiple keywords.
  Entities:
  - `Filter`: `id`, `title`, `filter_action` (either `hide` or `warn`), `context`, `keywords`
  - `FilterKeyword`: `id`, `keyword`, `whole_word`
  API endpoints:
  - `GET /api/v2/filters` to list filters (including keywords)
  - `POST /api/v2/filters` to create a new filter; `keywords_attributes` can also be passed to create keywords in one request
  - `GET /api/v2/filters/:id` to read a particular filter
  - `PUT /api/v2/filters/:id` to update a filter; `keywords_attributes` can also be passed to edit, delete or add keywords in one request
  - `DELETE /api/v2/filters/:id` to delete a particular filter
  - `GET /api/v2/filters/:id/keywords` to list keywords for a filter
  - `POST /api/v2/filters/:filter_id/keywords` to add a new keyword to a filter
  - `GET /api/v2/filter_keywords/:id` to read a particular keyword
  - `PUT /api/v2/filter_keywords/:id` to edit a particular keyword
  - `DELETE /api/v2/filter_keywords/:id` to delete a particular keyword
* Change from `irreversible` boolean to `action` enum
* Remove irrelevant `irreversible_must_be_within_context` check
* Fix /filters/new and /filters/edit with update for filter_action
* Fix Rubocop/Codeclimate complaints about task names
* Refactor FeedManager#phrase_filtered?: move regexp building and filter caching to the `CustomFilter` class. This does not change functional behavior yet, but it changes how the cache is built, using per-custom_filter regexps so that filters can be matched independently while still offering caching.
* Perform server-side filtering and output the result in the REST API
* Fix numerous filters_changed events being sent when editing multiple keywords at once
* Add some tests
* Use the new API in the web UI:
  - use client-side logic for filters we have fetched rules for, so that filter changes can be retroactively applied without reloading the UI
  - use server-side logic for filters we haven't fetched rules for yet (e.g. network error, or initial timeline loading)
* Minor optimizations and refactoring
* Perform server-side filtering on the streaming server
* Change the wording of filter action labels
* Fix issues pointed out by the linter
* Change design of “Show anyway” link in accordance with review comments
* Drop “irreversible” filtering behavior
* Move /api/v2/filter_keywords to /api/v1/filters/keywords
* Rename `filter_results` attribute to `filtered`
* Rename REST::LegacyFilterSerializer to REST::V1::FilterSerializer
* Fix systemChannelId value in streaming server
* Simplify code by removing client-side filtering code. The simplification comes at a cost though: filters aren't retroactively applied anymore.
  1. // @ts-check
  2. const os = require('os');
  3. const throng = require('throng');
  4. const dotenv = require('dotenv');
  5. const express = require('express');
  6. const http = require('http');
  7. const redis = require('redis');
  8. const pg = require('pg');
  9. const dbUrlToConfig = require('pg-connection-string').parse;
  10. const log = require('npmlog');
  11. const url = require('url');
  12. const uuid = require('uuid');
  13. const fs = require('fs');
  14. const WebSocket = require('ws');
  15. const { JSDOM } = require('jsdom');
  16. const env = process.env.NODE_ENV || 'development';
  17. dotenv.config({
  18. path: env === 'production' ? '.env.production' : '.env',
  19. });
  20. log.level = process.env.LOG_LEVEL || 'verbose';
  21. /**
  22. * @param {Object.<string, any>} defaultConfig
  23. * @param {string} redisUrl
  24. */
  25. const redisUrlToClient = async (defaultConfig, redisUrl) => {
  26. const config = defaultConfig;
  27. let client;
  28. if (!redisUrl) {
  29. client = redis.createClient(config);
  30. } else if (redisUrl.startsWith('unix://')) {
  31. client = redis.createClient(Object.assign(config, {
  32. socket: {
  33. path: redisUrl.slice(7),
  34. },
  35. }));
  36. } else {
  37. client = redis.createClient(Object.assign(config, {
  38. url: redisUrl,
  39. }));
  40. }
  41. client.on('error', (err) => log.error('Redis Client Error!', err));
  42. await client.connect();
  43. return client;
  44. };
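// Examples of accepted REDIS_URL values (illustrative, not exhaustive):
//   REDIS_URL=unix:///var/run/redis/redis.sock          -> connect over a UNIX socket
//   REDIS_URL=redis://user:pass@redis.example.com:6379/0 -> connect over TCP
// When REDIS_URL is unset, the client falls back to the host/port/password
// assembled from the individual REDIS_* environment variables further down.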
  45. const numWorkers = +process.env.STREAMING_CLUSTER_NUM || (env === 'development' ? 1 : Math.max(os.cpus().length - 1, 1));
  46. /**
  47. * @param {string} json
  48. * @param {any} req
  49. * @return {Object.<string, any>|null}
  50. */
  51. const parseJSON = (json, req) => {
  52. try {
  53. return JSON.parse(json);
  54. } catch (err) {
  55. if (req.accountId) {
  56. log.warn(req.requestId, `Error parsing message from user ${req.accountId}: ${err}`);
  57. } else {
  58. log.silly(req.requestId, `Error parsing message from ${req.remoteAddress}: ${err}`);
  59. }
  60. return null;
  61. }
  62. };
  63. const startMaster = () => {
  64. if (!process.env.SOCKET && process.env.PORT && isNaN(+process.env.PORT)) {
  65. log.warn('UNIX domain socket is now supported by using SOCKET. Please migrate from PORT hack.');
  66. }
  67. log.warn(`Starting streaming API server master with ${numWorkers} workers`);
  68. };
  69. /**
  70. * @return {Object.<string, any>}
  71. */
  72. const pgConfigFromEnv = () => {
  73. const pgConfigs = {
  74. development: {
  75. user: process.env.DB_USER || pg.defaults.user,
  76. password: process.env.DB_PASS || pg.defaults.password,
  77. database: process.env.DB_NAME || 'mastodon_development',
  78. host: process.env.DB_HOST || pg.defaults.host,
  79. port: process.env.DB_PORT || pg.defaults.port,
  80. },
  81. production: {
  82. user: process.env.DB_USER || 'mastodon',
  83. password: process.env.DB_PASS || '',
  84. database: process.env.DB_NAME || 'mastodon_production',
  85. host: process.env.DB_HOST || 'localhost',
  86. port: process.env.DB_PORT || 5432,
  87. },
  88. };
  89. let baseConfig;
  90. if (process.env.DATABASE_URL) {
  91. baseConfig = dbUrlToConfig(process.env.DATABASE_URL);
  92. } else {
  93. baseConfig = pgConfigs[env];
  94. if (process.env.DB_SSLMODE) {
  95. switch(process.env.DB_SSLMODE) {
  96. case 'disable':
  97. case '':
  98. baseConfig.ssl = false;
  99. break;
  100. case 'no-verify':
  101. baseConfig.ssl = { rejectUnauthorized: false };
  102. break;
  103. default:
  104. baseConfig.ssl = {};
  105. break;
  106. }
  107. }
  108. }
  109. return {
  110. ...baseConfig,
  111. max: process.env.DB_POOL || 10,
  112. connectionTimeoutMillis: 15000,
  113. application_name: '',
  114. };
  115. };
  116. const startWorker = async (workerId) => {
  117. log.warn(`Starting worker ${workerId}`);
  118. const app = express();
  119. app.set('trust proxy', process.env.TRUSTED_PROXY_IP ? process.env.TRUSTED_PROXY_IP.split(/(?:\s*,\s*|\s+)/) : 'loopback,uniquelocal');
  120. const pgPool = new pg.Pool(pgConfigFromEnv());
  121. const server = http.createServer(app);
  122. const redisNamespace = process.env.REDIS_NAMESPACE || null;
  123. const redisParams = {
  124. socket: {
  125. host: process.env.REDIS_HOST || '127.0.0.1',
  126. port: process.env.REDIS_PORT || 6379,
  127. },
  128. database: process.env.REDIS_DB || 0,
  129. password: process.env.REDIS_PASSWORD || undefined,
  130. };
  131. if (redisNamespace) {
  132. redisParams.namespace = redisNamespace;
  133. }
  134. const redisPrefix = redisNamespace ? `${redisNamespace}:` : '';
  135. /**
  136. * @type {Object.<string, Array.<function(string): void>>}
  137. */
  138. const subs = {};
  139. const redisSubscribeClient = await redisUrlToClient(redisParams, process.env.REDIS_URL);
  140. const redisClient = await redisUrlToClient(redisParams, process.env.REDIS_URL);
  141. /**
  142. * @param {string[]} channels
  143. * @return {function(): void}
  144. */
  145. const subscriptionHeartbeat = channels => {
  146. const interval = 6 * 60;
  147. const tellSubscribed = () => {
  148. channels.forEach(channel => redisClient.set(`${redisPrefix}subscribed:${channel}`, '1', 'EX', interval * 3));
  149. };
  150. tellSubscribed();
  151. const heartbeat = setInterval(tellSubscribed, interval * 1000);
  152. return () => {
  153. clearInterval(heartbeat);
  154. };
  155. };
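// Note: tellSubscribed() refreshes a `subscribed:<channel>` key in Redis every
// `interval` seconds with a TTL of three intervals, so other parts of the
// system (presumably the Rails process publishing to these channels) can tell
// whether anyone is still listening on a given channel.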
  156. /**
  157. * @param {string} message
  158. * @param {string} channel
  159. */
  160. const onRedisMessage = (message, channel) => {
  161. const callbacks = subs[channel];
  162. log.silly(`New message on channel ${channel}`);
  163. if (!callbacks) {
  164. return;
  165. }
  166. callbacks.forEach(callback => callback(message));
  167. };
  168. /**
  169. * @param {string} channel
  170. * @param {function(string): void} callback
  171. */
  172. const subscribe = (channel, callback) => {
  173. log.silly(`Adding listener for ${channel}`);
  174. subs[channel] = subs[channel] || [];
  175. if (subs[channel].length === 0) {
  176. log.verbose(`Subscribe ${channel}`);
  177. redisSubscribeClient.subscribe(channel, onRedisMessage);
  178. }
  179. subs[channel].push(callback);
  180. };
  181. /**
  182. * @param {string} channel
  183. */
  184. const unsubscribe = (channel, callback) => {
  185. log.silly(`Removing listener for ${channel}`);
  186. if (!subs[channel]) {
  187. return;
  188. }
  189. subs[channel] = subs[channel].filter(item => item !== callback);
  190. if (subs[channel].length === 0) {
  191. log.verbose(`Unsubscribe ${channel}`);
  192. redisSubscribeClient.unsubscribe(channel);
  193. delete subs[channel];
  194. }
  195. };
  196. const FALSE_VALUES = [
  197. false,
  198. 0,
  199. '0',
  200. 'f',
  201. 'F',
  202. 'false',
  203. 'FALSE',
  204. 'off',
  205. 'OFF',
  206. ];
  207. /**
  208. * @param {any} value
  209. * @return {boolean}
  210. */
  211. const isTruthy = value =>
  212. value && !FALSE_VALUES.includes(value);
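// Illustrative behaviour of isTruthy():
//   isTruthy('true')  -> true     isTruthy('1')   -> true
//   isTruthy('false') -> false    isTruthy('off') -> false
//   isTruthy(undefined) is falsy (the raw value is returned, not a strict boolean)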
  213. /**
  214. * @param {any} req
  215. * @param {any} res
  216. * @param {function(Error=): void} next
  217. */
  218. const allowCrossDomain = (req, res, next) => {
  219. res.header('Access-Control-Allow-Origin', '*');
  220. res.header('Access-Control-Allow-Headers', 'Authorization, Accept, Cache-Control');
  221. res.header('Access-Control-Allow-Methods', 'GET, OPTIONS');
  222. next();
  223. };
  224. /**
  225. * @param {any} req
  226. * @param {any} res
  227. * @param {function(Error=): void} next
  228. */
  229. const setRequestId = (req, res, next) => {
  230. req.requestId = uuid.v4();
  231. res.header('X-Request-Id', req.requestId);
  232. next();
  233. };
  234. /**
  235. * @param {any} req
  236. * @param {any} res
  237. * @param {function(Error=): void} next
  238. */
  239. const setRemoteAddress = (req, res, next) => {
  240. req.remoteAddress = req.connection.remoteAddress;
  241. next();
  242. };
  243. /**
  244. * @param {any} req
  245. * @param {string[]} necessaryScopes
  246. * @return {boolean}
  247. */
  248. const isInScope = (req, necessaryScopes) =>
  249. req.scopes.some(scope => necessaryScopes.includes(scope));
  250. /**
  251. * @param {string} token
  252. * @param {any} req
  253. * @return {Promise.<void>}
  254. */
  255. const accountFromToken = (token, req) => new Promise((resolve, reject) => {
  256. pgPool.connect((err, client, done) => {
  257. if (err) {
  258. reject(err);
  259. return;
  260. }
  261. client.query('SELECT oauth_access_tokens.id, oauth_access_tokens.resource_owner_id, users.account_id, users.chosen_languages, oauth_access_tokens.scopes, devices.device_id FROM oauth_access_tokens INNER JOIN users ON oauth_access_tokens.resource_owner_id = users.id LEFT OUTER JOIN devices ON oauth_access_tokens.id = devices.access_token_id WHERE oauth_access_tokens.token = $1 AND oauth_access_tokens.revoked_at IS NULL LIMIT 1', [token], (err, result) => {
  262. done();
  263. if (err) {
  264. reject(err);
  265. return;
  266. }
  267. if (result.rows.length === 0) {
  268. err = new Error('Invalid access token');
  269. err.status = 401;
  270. reject(err);
  271. return;
  272. }
  273. req.accessTokenId = result.rows[0].id;
  274. req.scopes = result.rows[0].scopes.split(' ');
  275. req.accountId = result.rows[0].account_id;
  276. req.chosenLanguages = result.rows[0].chosen_languages;
  277. req.deviceId = result.rows[0].device_id;
  278. resolve();
  279. });
  280. });
  281. });
  282. /**
  283. * @param {any} req
  285. * @return {Promise.<void>}
  286. */
  287. const accountFromRequest = (req) => new Promise((resolve, reject) => {
  288. const authorization = req.headers.authorization;
  289. const location = url.parse(req.url, true);
  290. const accessToken = location.query.access_token || req.headers['sec-websocket-protocol'];
  291. if (!authorization && !accessToken) {
  292. const err = new Error('Missing access token');
  293. err.status = 401;
  294. reject(err);
  295. return;
  296. }
  297. const token = authorization ? authorization.replace(/^Bearer /, '') : accessToken;
  298. resolve(accountFromToken(token, req));
  299. });
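// The access token can arrive in three ways (all handled above):
//   - an `Authorization: Bearer <token>` header,
//   - an `access_token` query string parameter, or
//   - the `Sec-WebSocket-Protocol` header (used by browser WebSocket clients
//     that cannot set arbitrary request headers).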
  300. /**
  301. * @param {any} req
  302. * @return {string}
  303. */
  304. const channelNameFromPath = req => {
  305. const { path, query } = req;
  306. const onlyMedia = isTruthy(query.only_media);
  307. const allowLocalOnly = isTruthy(query.allow_local_only);
  308. switch (path) {
  309. case '/api/v1/streaming/user':
  310. return 'user';
  311. case '/api/v1/streaming/user/notification':
  312. return 'user:notification';
  313. case '/api/v1/streaming/public':
  314. return onlyMedia ? 'public:media' : 'public';
  315. case '/api/v1/streaming/public/local':
  316. return onlyMedia ? 'public:local:media' : 'public:local';
  317. case '/api/v1/streaming/public/remote':
  318. return onlyMedia ? 'public:remote:media' : 'public:remote';
  319. case '/api/v1/streaming/hashtag':
  320. return 'hashtag';
  321. case '/api/v1/streaming/hashtag/local':
  322. return 'hashtag:local';
  323. case '/api/v1/streaming/direct':
  324. return 'direct';
  325. case '/api/v1/streaming/list':
  326. return 'list';
  327. default:
  328. return undefined;
  329. }
  330. };
  331. const PUBLIC_CHANNELS = [
  332. 'public',
  333. 'public:media',
  334. 'public:local',
  335. 'public:local:media',
  336. 'public:remote',
  337. 'public:remote:media',
  338. 'hashtag',
  339. 'hashtag:local',
  340. ];
  341. /**
  342. * @param {any} req
  343. * @param {string} channelName
  344. * @return {Promise.<void>}
  345. */
  346. const checkScopes = (req, channelName) => new Promise((resolve, reject) => {
  347. log.silly(req.requestId, `Checking OAuth scopes for ${channelName}`);
  348. // When accessing public channels, no scopes are needed
  349. if (PUBLIC_CHANNELS.includes(channelName)) {
  350. resolve();
  351. return;
  352. }
  353. // The `read` scope has the highest priority: if the token has it,
  354. // then it can access all streams.
  355. const requiredScopes = ['read'];
  356. // When accessing specifically the notifications stream,
  357. // we need read:notifications, while in all other cases
  358. // we can allow access with read:statuses. Note that the
  359. // user stream will not contain notifications unless
  360. // the token also has the read or read:notifications scope;
  361. // that is handled separately.
  362. if (channelName === 'user:notification') {
  363. requiredScopes.push('read:notifications');
  364. } else {
  365. requiredScopes.push('read:statuses');
  366. }
  367. if (req.scopes && requiredScopes.some(requiredScope => req.scopes.includes(requiredScope))) {
  368. resolve();
  369. return;
  370. }
  371. const err = new Error('Access token does not cover required scopes');
  372. err.status = 401;
  373. reject(err);
  374. });
  375. /**
  376. * @param {any} info
  377. * @param {function(boolean, number, string): void} callback
  378. */
  379. const wsVerifyClient = (info, callback) => {
  380. // When verifying the websockets connection, we no longer pre-emptively
  381. // check OAuth scopes and drop the connection if they're missing. We only
  382. // drop the connection if access without token is not allowed by environment
  383. // variables. OAuth scope checks are moved to the point of subscription
  384. // to a specific stream.
  385. accountFromRequest(info.req).then(() => {
  386. callback(true, undefined, undefined);
  387. }).catch(err => {
  388. log.error(info.req.requestId, err.toString());
  389. callback(false, 401, 'Unauthorized');
  390. });
  391. };
  392. /**
  393. * @typedef SystemMessageHandlers
  394. * @property {function(): void} onKill
  395. */
  396. /**
  397. * @param {any} req
  398. * @param {SystemMessageHandlers} eventHandlers
  399. * @return {function(string): void}
  400. */
  401. const createSystemMessageListener = (req, eventHandlers) => {
  402. return message => {
  403. const json = parseJSON(message, req);
  404. if (!json) return;
  405. const { event } = json;
  406. log.silly(req.requestId, `System message for ${req.accountId}: ${event}`);
  407. if (event === 'kill') {
  408. log.verbose(req.requestId, `Closing connection for ${req.accountId} due to expired access token`);
  409. eventHandlers.onKill();
  410. } else if (event === 'filters_changed') {
  411. log.verbose(req.requestId, `Invalidating filters cache for ${req.accountId}`);
  412. req.cachedFilters = null;
  413. }
  414. };
  415. };
  416. /**
  417. * @param {any} req
  418. * @param {any} res
  419. */
  420. const subscribeHttpToSystemChannel = (req, res) => {
  421. const accessTokenChannelId = `timeline:access_token:${req.accessTokenId}`;
  422. const systemChannelId = `timeline:system:${req.accountId}`;
  423. const listener = createSystemMessageListener(req, {
  424. onKill() {
  425. res.end();
  426. },
  427. });
  428. res.on('close', () => {
  429. unsubscribe(`${redisPrefix}${accessTokenChannelId}`, listener);
  430. unsubscribe(`${redisPrefix}${systemChannelId}`, listener);
  431. });
  432. subscribe(`${redisPrefix}${accessTokenChannelId}`, listener);
  433. subscribe(`${redisPrefix}${systemChannelId}`, listener);
  434. };
  435. /**
  436. * @param {any} req
  437. * @param {any} res
  438. * @param {function(Error=): void} next
  439. */
  440. const authenticationMiddleware = (req, res, next) => {
  441. if (req.method === 'OPTIONS') {
  442. next();
  443. return;
  444. }
  445. accountFromRequest(req).then(() => checkScopes(req, channelNameFromPath(req))).then(() => {
  446. subscribeHttpToSystemChannel(req, res);
  447. }).then(() => {
  448. next();
  449. }).catch(err => {
  450. next(err);
  451. });
  452. };
  453. /**
  454. * @param {Error} err
  455. * @param {any} req
  456. * @param {any} res
  457. * @param {function(Error=): void} next
  458. */
  459. const errorMiddleware = (err, req, res, next) => {
  460. log.error(req.requestId, err.toString());
  461. if (res.headersSent) {
  462. next(err);
  463. return;
  464. }
  465. res.writeHead(err.status || 500, { 'Content-Type': 'application/json' });
  466. res.end(JSON.stringify({ error: err.status ? err.toString() : 'An unexpected error occurred' }));
  467. };
  468. /**
  469. * @param {array} arr
  470. * @param {number=} shift
  471. * @return {string}
  472. */
  473. const placeholders = (arr, shift = 0) => arr.map((_, i) => `$${i + 1 + shift}`).join(', ');
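// Example: placeholders(['a', 'b', 'c'], 2) returns '$3, $4, $5', which is used
// below to splice a variable-length IN (...) list into a parameterised query.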
  474. /**
  475. * @param {string} listId
  476. * @param {any} req
  477. * @return {Promise.<void>}
  478. */
  479. const authorizeListAccess = (listId, req) => new Promise((resolve, reject) => {
  480. const { accountId } = req;
  481. pgPool.connect((err, client, done) => {
  482. if (err) {
  483. reject();
  484. return;
  485. }
  486. client.query('SELECT id, account_id FROM lists WHERE id = $1 LIMIT 1', [listId], (err, result) => {
  487. done();
  488. if (err || result.rows.length === 0 || result.rows[0].account_id !== accountId) {
  489. reject();
  490. return;
  491. }
  492. resolve();
  493. });
  494. });
  495. });
  496. /**
  497. * @param {string[]} ids
  498. * @param {any} req
  499. * @param {function(string, string): void} output
  500. * @param {function(string[], function(string): void): void} attachCloseHandler
  501. * @param {boolean=} needsFiltering
  502. * @param {boolean=} allowLocalOnly
  503. * @return {function(string): void}
  504. */
  505. const streamFrom = (ids, req, output, attachCloseHandler, needsFiltering = false, allowLocalOnly = false) => {
  506. const accountId = req.accountId || req.remoteAddress;
  507. log.verbose(req.requestId, `Starting stream from ${ids.join(', ')} for ${accountId}`);
  508. const listener = message => {
  509. const json = parseJSON(message, req);
  510. if (!json) return;
  511. const { event, payload, queued_at } = json;
  512. const transmit = () => {
  513. const now = new Date().getTime();
  514. const delta = now - queued_at;
  515. const encodedPayload = typeof payload === 'object' ? JSON.stringify(payload) : payload;
  516. log.silly(req.requestId, `Transmitting for ${accountId}: ${event} ${encodedPayload} Delay: ${delta}ms`);
  517. output(event, encodedPayload);
  518. };
  519. // Only send local-only statuses to logged-in users
  520. if (event === 'update' && payload.local_only && !(req.accountId && allowLocalOnly)) {
  521. log.silly(req.requestId, `Message ${payload.id} filtered because it was local-only`);
  522. return;
  523. }
  524. // Only messages that may require filtering are statuses, since notifications
  525. // are already personalized and deletes do not matter
  526. if (!needsFiltering || event !== 'update') {
  527. transmit();
  528. return;
  529. }
  530. const unpackedPayload = payload;
  531. const targetAccountIds = [unpackedPayload.account.id].concat(unpackedPayload.mentions.map(item => item.id));
  532. const accountDomain = unpackedPayload.account.acct.split('@')[1];
  533. if (Array.isArray(req.chosenLanguages) && unpackedPayload.language !== null && req.chosenLanguages.indexOf(unpackedPayload.language) === -1) {
  534. log.silly(req.requestId, `Message ${unpackedPayload.id} filtered by language (${unpackedPayload.language})`);
  535. return;
  536. }
  537. // When the account is not logged in, it is not necessary to confirm the block or mute
  538. if (!req.accountId) {
  539. transmit();
  540. return;
  541. }
  542. pgPool.connect((err, client, done) => {
  543. if (err) {
  544. log.error(err);
  545. return;
  546. }
  547. const queries = [
  548. client.query(`SELECT 1
  549. FROM blocks
  550. WHERE (account_id = $1 AND target_account_id IN (${placeholders(targetAccountIds, 2)}))
  551. OR (account_id = $2 AND target_account_id = $1)
  552. UNION
  553. SELECT 1
  554. FROM mutes
  555. WHERE account_id = $1
  556. AND target_account_id IN (${placeholders(targetAccountIds, 2)})`, [req.accountId, unpackedPayload.account.id].concat(targetAccountIds)),
  557. ];
  558. if (accountDomain) {
  559. queries.push(client.query('SELECT 1 FROM account_domain_blocks WHERE account_id = $1 AND domain = $2', [req.accountId, accountDomain]));
  560. }
  561. if (!unpackedPayload.filtered && !req.cachedFilters) {
  562. queries.push(client.query('SELECT filter.id AS id, filter.phrase AS title, filter.context AS context, filter.expires_at AS expires_at, filter.action AS filter_action, keyword.keyword AS keyword, keyword.whole_word AS whole_word FROM custom_filter_keywords keyword JOIN custom_filters filter ON keyword.custom_filter_id = filter.id WHERE filter.account_id = $1 AND (filter.expires_at IS NULL OR filter.expires_at > NOW())', [req.accountId]));
  563. }
  564. Promise.all(queries).then(values => {
  565. done();
  566. if (values[0].rows.length > 0 || (accountDomain && values[1].rows.length > 0)) {
  567. return;
  568. }
  569. if (!unpackedPayload.filtered && !req.cachedFilters) {
  570. const filterRows = values[accountDomain ? 2 : 1].rows;
  571. req.cachedFilters = filterRows.reduce((cache, row) => {
  572. if (cache[row.id]) {
  573. cache[row.id].keywords.push([row.keyword, row.whole_word]);
  574. } else {
  575. cache[row.id] = {
  576. keywords: [[row.keyword, row.whole_word]],
  577. expires_at: row.expires_at,
  578. repr: {
  579. id: row.id,
  580. title: row.title,
  581. context: row.context,
  582. expires_at: row.expires_at,
  583. filter_action: ['warn', 'hide'][row.filter_action],
  584. },
  585. };
  586. }
  587. return cache;
  588. }, {});
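// Build one case-insensitive regexp per filter: each keyword is escaped,
// wrapped in \b word boundaries when whole_word is set (and the keyword
// starts/ends with a word character), and the keywords are OR-ed together.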
  589. Object.keys(req.cachedFilters).forEach((key) => {
  590. req.cachedFilters[key].regexp = new RegExp(req.cachedFilters[key].keywords.map(([keyword, whole_word]) => {
  591. let expr = keyword.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
  592. if (whole_word) {
  593. if (/^[\w]/.test(expr)) {
  594. expr = `\\b${expr}`;
  595. }
  596. if (/[\w]$/.test(expr)) {
  597. expr = `${expr}\\b`;
  598. }
  599. }
  600. return expr;
  601. }).join('|'), 'i');
  602. });
  603. }
  604. // Check filters
  605. if (req.cachedFilters && !unpackedPayload.filtered) {
  606. const status = unpackedPayload;
  607. const searchContent = ([status.spoiler_text || '', status.content].concat((status.poll && status.poll.options) ? status.poll.options.map(option => option.title) : [])).concat(status.media_attachments.map(att => att.description)).join('\n\n').replace(/<br\s*\/?>/g, '\n').replace(/<\/p><p>/g, '\n\n');
  608. const searchIndex = JSDOM.fragment(searchContent).textContent;
  609. const now = new Date();
  610. payload.filtered = [];
  611. Object.values(req.cachedFilters).forEach((cachedFilter) => {
  612. if ((cachedFilter.expires_at === null || cachedFilter.expires_at > now)) {
  613. const keyword_matches = searchIndex.match(cachedFilter.regexp);
  614. if (keyword_matches) {
  615. payload.filtered.push({
  616. filter: cachedFilter.repr,
  617. keyword_matches,
  618. });
  619. }
  620. }
  621. });
  622. }
  623. transmit();
  624. }).catch(err => {
  625. log.error(err);
  626. done();
  627. });
  628. });
  629. };
  630. ids.forEach(id => {
  631. subscribe(`${redisPrefix}${id}`, listener);
  632. });
  633. if (attachCloseHandler) {
  634. attachCloseHandler(ids.map(id => `${redisPrefix}${id}`), listener);
  635. }
  636. return listener;
  637. };
  638. /**
  639. * @param {any} req
  640. * @param {any} res
  641. * @return {function(string, string): void}
  642. */
  643. const streamToHttp = (req, res) => {
  644. const accountId = req.accountId || req.remoteAddress;
  645. res.setHeader('Content-Type', 'text/event-stream');
  646. res.setHeader('Cache-Control', 'no-store');
  647. res.setHeader('Transfer-Encoding', 'chunked');
  648. res.write(':)\n');
  649. const heartbeat = setInterval(() => res.write(':thump\n'), 15000);
  650. req.on('close', () => {
  651. log.verbose(req.requestId, `Ending stream for ${accountId}`);
  652. clearInterval(heartbeat);
  653. });
  654. return (event, payload) => {
  655. res.write(`event: ${event}\n`);
  656. res.write(`data: ${payload}\n\n`);
  657. };
  658. };
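// Illustrative Server-Sent Events output produced by the function above:
//   :thump                               <- heartbeat written every 15 seconds
//   event: update
//   data: {"id":"...","content":"..."}   <- payload JSON on a single line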
  659. /**
  660. * @param {any} req
  661. * @param {function(): void} [closeHandler]
  662. * @return {function(string[]): void}
  663. */
  664. const streamHttpEnd = (req, closeHandler = undefined) => (ids) => {
  665. req.on('close', () => {
  666. ids.forEach(id => {
  667. unsubscribe(id);
  668. });
  669. if (closeHandler) {
  670. closeHandler();
  671. }
  672. });
  673. };
  674. /**
  675. * @param {any} req
  676. * @param {any} ws
  677. * @param {string[]} streamName
  678. * @return {function(string, string): void}
  679. */
  680. const streamToWs = (req, ws, streamName) => (event, payload) => {
  681. if (ws.readyState !== ws.OPEN) {
  682. log.error(req.requestId, 'Tried writing to closed socket');
  683. return;
  684. }
  685. ws.send(JSON.stringify({ stream: streamName, event, payload }));
  686. };
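// Illustrative WebSocket frame sent by the function above (assuming a hashtag
// subscription whose stream name is ['hashtag', 'cats']):
//   {"stream":["hashtag","cats"],"event":"update","payload":"{...}"}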
  687. /**
  688. * @param {any} res
  689. */
  690. const httpNotFound = res => {
  691. res.writeHead(404, { 'Content-Type': 'application/json' });
  692. res.end(JSON.stringify({ error: 'Not found' }));
  693. };
  694. app.use(setRequestId);
  695. app.use(setRemoteAddress);
  696. app.use(allowCrossDomain);
  697. app.get('/api/v1/streaming/health', (req, res) => {
  698. res.writeHead(200, { 'Content-Type': 'text/plain' });
  699. res.end('OK');
  700. });
  701. app.get('/metrics', (req, res) => server.getConnections((err, count) => {
  702. res.writeHead(200, { 'Content-Type': 'application/openmetrics-text; version=1.0.0; charset=utf-8' });
  703. res.write('# TYPE connected_clients gauge\n');
  704. res.write('# HELP connected_clients The number of clients connected to the streaming server\n');
  705. res.write(`connected_clients ${count}.0\n`);
  706. res.write('# TYPE connected_channels gauge\n');
  707. res.write('# HELP connected_channels The number of Redis channels the streaming server is subscribed to\n');
  708. res.write(`connected_channels ${Object.keys(subs).length}.0\n`);
  709. res.write('# TYPE pg_pool_total_connections gauge\n');
  710. res.write('# HELP pg_pool_total_connections The total number of clients existing within the pool\n');
  711. res.write(`pg_pool_total_connections ${pgPool.totalCount}.0\n`);
  712. res.write('# TYPE pg_pool_idle_connections gauge\n');
  713. res.write('# HELP pg_pool_idle_connections The number of clients which are not checked out but are currently idle in the pool\n');
  714. res.write(`pg_pool_idle_connections ${pgPool.idleCount}.0\n`);
  715. res.write('# TYPE pg_pool_waiting_queries gauge\n');
  716. res.write('# HELP pg_pool_waiting_queries The number of queued requests waiting on a client when all clients are checked out\n');
  717. res.write(`pg_pool_waiting_queries ${pgPool.waitingCount}.0\n`);
  718. res.write('# EOF\n');
  719. res.end();
  720. }));
  721. app.use(authenticationMiddleware);
  722. app.use(errorMiddleware);
  723. app.get('/api/v1/streaming/*', (req, res) => {
  724. channelNameToIds(req, channelNameFromPath(req), req.query).then(({ channelIds, options }) => {
  725. const onSend = streamToHttp(req, res);
  726. const onEnd = streamHttpEnd(req, subscriptionHeartbeat(channelIds));
  727. streamFrom(channelIds, req, onSend, onEnd, options.needsFiltering, options.allowLocalOnly);
  728. }).catch(err => {
  729. log.verbose(req.requestId, 'Subscription error:', err.toString());
  730. httpNotFound(res);
  731. });
  732. });
  733. const wss = new WebSocket.Server({ server, verifyClient: wsVerifyClient });
  734. /**
  735. * @typedef StreamParams
  736. * @property {string} [tag]
  737. * @property {string} [list]
  738. * @property {string} [only_media]
  739. */
  740. /**
  741. * @param {any} req
  742. * @return {string[]}
  743. */
  744. const channelsForUserStream = req => {
  745. const arr = [`timeline:${req.accountId}`];
  746. if (isInScope(req, ['crypto']) && req.deviceId) {
  747. arr.push(`timeline:${req.accountId}:${req.deviceId}`);
  748. }
  749. if (isInScope(req, ['read', 'read:notifications'])) {
  750. arr.push(`timeline:${req.accountId}:notifications`);
  751. }
  752. return arr;
  753. };
  754. /**
  755. * See app/lib/ascii_folder.rb for the canon definitions
  756. * of these constants
  757. */
  758. const NON_ASCII_CHARS = 'ÀÁÂÃÄÅàáâãäåĀāĂ㥹ÇçĆćĈĉĊċČčÐðĎďĐđÈÉÊËèéêëĒēĔĕĖėĘęĚěĜĝĞğĠġĢģĤĥĦħÌÍÎÏìíîïĨĩĪīĬĭĮįİıĴĵĶķĸĹĺĻļĽľĿŀŁłÑñŃńŅņŇňʼnŊŋÒÓÔÕÖØòóôõöøŌōŎŏŐőŔŕŖŗŘřŚśŜŝŞşŠšſŢţŤťŦŧÙÚÛÜùúûüŨũŪūŬŭŮůŰűŲųŴŵÝýÿŶŷŸŹźŻżŽž';
  759. const EQUIVALENT_ASCII_CHARS = 'AAAAAAaaaaaaAaAaAaCcCcCcCcCcDdDdDdEEEEeeeeEeEeEeEeEeGgGgGgGgHhHhIIIIiiiiIiIiIiIiIiJjKkkLlLlLlLlLlNnNnNnNnnNnOOOOOOooooooOoOoOoRrRrRrSsSsSsSssTtTtTtUUUUuuuuUuUuUuUuUuUuWwYyyYyYZzZzZz';
  760. /**
  761. * @param {string} str
  762. * @return {string}
  763. */
  764. const foldToASCII = str => {
  765. const regex = new RegExp(NON_ASCII_CHARS.split('').join('|'), 'g');
  766. return str.replace(regex, match => {
  767. const index = NON_ASCII_CHARS.indexOf(match);
  768. return EQUIVALENT_ASCII_CHARS[index];
  769. });
  770. };
  771. /**
  772. * @param {string} str
  773. * @return {string}
  774. */
  775. const normalizeHashtag = str => {
  776. return foldToASCII(str.normalize('NFKC').toLowerCase()).replace(/[^\p{L}\p{N}_\u00b7\u200c]/gu, '');
  777. };
  778. /**
  779. * @param {any} req
  780. * @param {string} name
  781. * @param {StreamParams} params
  782. * @return {Promise.<{ channelIds: string[], options: { needsFiltering: boolean, allowLocalOnly: boolean } }>}
  783. */
  784. const channelNameToIds = (req, name, params) => new Promise((resolve, reject) => {
  785. switch (name) {
  786. case 'user':
  787. resolve({
  788. channelIds: channelsForUserStream(req),
  789. options: { needsFiltering: false, allowLocalOnly: true },
  790. });
  791. break;
  792. case 'user:notification':
  793. resolve({
  794. channelIds: [`timeline:${req.accountId}:notifications`],
  795. options: { needsFiltering: false, allowLocalOnly: true },
  796. });
  797. break;
  798. case 'public':
  799. resolve({
  800. channelIds: ['timeline:public'],
  801. options: { needsFiltering: true, allowLocalOnly: isTruthy(params.allow_local_only) },
  802. });
  803. break;
  804. case 'public:allow_local_only':
  805. resolve({
  806. channelIds: ['timeline:public'],
  807. options: { needsFiltering: true, allowLocalOnly: true },
  808. });
  809. break;
  810. case 'public:local':
  811. resolve({
  812. channelIds: ['timeline:public:local'],
  813. options: { needsFiltering: true, allowLocalOnly: true },
  814. });
  815. break;
  816. case 'public:remote':
  817. resolve({
  818. channelIds: ['timeline:public:remote'],
  819. options: { needsFiltering: true, allowLocalOnly: false },
  820. });
  821. break;
  822. case 'public:media':
  823. resolve({
  824. channelIds: ['timeline:public:media'],
  825. options: { needsFiltering: true, allowLocalOnly: isTruthy(params.allow_local_only) },
  826. });
  827. break;
  828. case 'public:allow_local_only:media':
  829. resolve({
  830. channelIds: ['timeline:public:media'],
  831. options: { needsFiltering: true, allowLocalOnly: true },
  832. });
  833. break;
  834. case 'public:local:media':
  835. resolve({
  836. channelIds: ['timeline:public:local:media'],
  837. options: { needsFiltering: true, allowLocalOnly: true },
  838. });
  839. break;
  840. case 'public:remote:media':
  841. resolve({
  842. channelIds: ['timeline:public:remote:media'],
  843. options: { needsFiltering: true, allowLocalOnly: false },
  844. });
  845. break;
  846. case 'direct':
  847. resolve({
  848. channelIds: [`timeline:direct:${req.accountId}`],
  849. options: { needsFiltering: false, allowLocalOnly: true },
  850. });
  851. break;
  852. case 'hashtag':
  853. if (!params.tag || params.tag.length === 0) {
  854. reject('No tag for stream provided');
  855. } else {
  856. resolve({
  857. channelIds: [`timeline:hashtag:${normalizeHashtag(params.tag)}`],
  858. options: { needsFiltering: true, allowLocalOnly: true },
  859. });
  860. }
  861. break;
  862. case 'hashtag:local':
  863. if (!params.tag || params.tag.length === 0) {
  864. reject('No tag for stream provided');
  865. } else {
  866. resolve({
  867. channelIds: [`timeline:hashtag:${normalizeHashtag(params.tag)}:local`],
  868. options: { needsFiltering: true, allowLocalOnly: true },
  869. });
  870. }
  871. break;
  872. case 'list':
  873. authorizeListAccess(params.list, req).then(() => {
  874. resolve({
  875. channelIds: [`timeline:list:${params.list}`],
  876. options: { needsFiltering: false, allowLocalOnly: true },
  877. });
  878. }).catch(() => {
  879. reject('Not authorized to stream this list');
  880. });
  881. break;
  882. default:
  883. reject('Unknown stream type');
  884. }
  885. });
  886. /**
  887. * @param {string} channelName
  888. * @param {StreamParams} params
  889. * @return {string[]}
  890. */
  891. const streamNameFromChannelName = (channelName, params) => {
  892. if (channelName === 'list') {
  893. return [channelName, params.list];
  894. } else if (['hashtag', 'hashtag:local'].includes(channelName)) {
  895. return [channelName, params.tag];
  896. } else {
  897. return [channelName];
  898. }
  899. };
  900. /**
  901. * @typedef WebSocketSession
  902. * @property {any} socket
  903. * @property {any} request
  904. * @property {Object.<string, { listener: function(string): void, stopHeartbeat: function(): void }>} subscriptions
  905. */
  906. /**
  907. * @param {WebSocketSession} session
  908. * @param {string} channelName
  909. * @param {StreamParams} params
  910. */
  911. const subscribeWebsocketToChannel = ({ socket, request, subscriptions }, channelName, params) =>
  912. checkScopes(request, channelName).then(() => channelNameToIds(request, channelName, params)).then(({
  913. channelIds,
  914. options,
  915. }) => {
  916. if (subscriptions[channelIds.join(';')]) {
  917. return;
  918. }
  919. const onSend = streamToWs(request, socket, streamNameFromChannelName(channelName, params));
  920. const stopHeartbeat = subscriptionHeartbeat(channelIds);
  921. const listener = streamFrom(channelIds, request, onSend, undefined, options.needsFiltering, options.allowLocalOnly);
  922. subscriptions[channelIds.join(';')] = {
  923. listener,
  924. stopHeartbeat,
  925. };
  926. }).catch(err => {
  927. log.verbose(request.requestId, 'Subscription error:', err.toString());
  928. socket.send(JSON.stringify({ error: err.toString() }));
  929. });
  930. /**
  931. * @param {WebSocketSession} session
  932. * @param {string} channelName
  933. * @param {StreamParams} params
  934. */
  935. const unsubscribeWebsocketFromChannel = ({ socket, request, subscriptions }, channelName, params) =>
  936. channelNameToIds(request, channelName, params).then(({ channelIds }) => {
  937. log.verbose(request.requestId, `Ending stream from ${channelIds.join(', ')} for ${request.accountId}`);
  938. const subscription = subscriptions[channelIds.join(';')];
  939. if (!subscription) {
  940. return;
  941. }
  942. const { listener, stopHeartbeat } = subscription;
  943. channelIds.forEach(channelId => {
  944. unsubscribe(`${redisPrefix}${channelId}`, listener);
  945. });
  946. stopHeartbeat();
  947. delete subscriptions[channelIds.join(';')];
  948. }).catch(err => {
  949. log.verbose(request.requestId, 'Unsubscription error:', err);
  950. socket.send(JSON.stringify({ error: err.toString() }));
  951. });
  952. /**
  953. * @param {WebSocketSession} session
  954. */
  955. const subscribeWebsocketToSystemChannel = ({ socket, request, subscriptions }) => {
  956. const accessTokenChannelId = `timeline:access_token:${request.accessTokenId}`;
  957. const systemChannelId = `timeline:system:${request.accountId}`;
  958. const listener = createSystemMessageListener(request, {
  959. onKill() {
  960. socket.close();
  961. },
  962. });
  963. subscribe(`${redisPrefix}${accessTokenChannelId}`, listener);
  964. subscribe(`${redisPrefix}${systemChannelId}`, listener);
  965. subscriptions[accessTokenChannelId] = {
  966. listener,
  967. stopHeartbeat: () => {
  968. },
  969. };
  970. subscriptions[systemChannelId] = {
  971. listener,
  972. stopHeartbeat: () => {
  973. },
  974. };
  975. };
  976. /**
  977. * @param {string|string[]} arrayOrString
  978. * @return {string}
  979. */
  980. const firstParam = arrayOrString => {
  981. if (Array.isArray(arrayOrString)) {
  982. return arrayOrString[0];
  983. } else {
  984. return arrayOrString;
  985. }
  986. };
  987. wss.on('connection', (ws, req) => {
  988. const location = url.parse(req.url, true);
  989. req.requestId = uuid.v4();
  990. req.remoteAddress = ws._socket.remoteAddress;
  991. ws.isAlive = true;
  992. ws.on('pong', () => {
  993. ws.isAlive = true;
  994. });
  995. /**
  996. * @type {WebSocketSession}
  997. */
  998. const session = {
  999. socket: ws,
  1000. request: req,
  1001. subscriptions: {},
  1002. };
  1003. const onEnd = () => {
  1004. const keys = Object.keys(session.subscriptions);
  1005. keys.forEach(channelIds => {
  1006. const { listener, stopHeartbeat } = session.subscriptions[channelIds];
  1007. channelIds.split(';').forEach(channelId => {
  1008. unsubscribe(`${redisPrefix}${channelId}`, listener);
  1009. });
  1010. stopHeartbeat();
  1011. });
  1012. };
  1013. ws.on('close', onEnd);
  1014. ws.on('error', onEnd);
  1015. ws.on('message', data => {
  1016. const json = parseJSON(data, session.request);
  1017. if (!json) return;
  1018. const { type, stream, ...params } = json;
  1019. if (type === 'subscribe') {
  1020. subscribeWebsocketToChannel(session, firstParam(stream), params);
  1021. } else if (type === 'unsubscribe') {
  1022. unsubscribeWebsocketFromChannel(session, firstParam(stream), params);
  1023. } else {
  1024. // Unknown action type
  1025. }
  1026. });
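// Illustrative client messages handled above (any extra fields are passed
// through as stream params):
//   {"type":"subscribe","stream":"hashtag","tag":"cats"}
//   {"type":"unsubscribe","stream":"public"}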
  1027. subscribeWebsocketToSystemChannel(session);
  1028. if (location.query.stream) {
  1029. subscribeWebsocketToChannel(session, firstParam(location.query.stream), location.query);
  1030. }
  1031. });
  1032. setInterval(() => {
  1033. wss.clients.forEach(ws => {
  1034. if (ws.isAlive === false) {
  1035. ws.terminate();
  1036. return;
  1037. }
  1038. ws.isAlive = false;
  1039. ws.ping('', false);
  1040. });
  1041. }, 30000);
  1042. attachServerWithConfig(server, address => {
  1043. log.warn(`Worker ${workerId} now listening on ${address}`);
  1044. });
  1045. const onExit = () => {
  1046. log.warn(`Worker ${workerId} exiting`);
  1047. server.close();
  1048. process.exit(0);
  1049. };
  1050. const onError = (err) => {
  1051. log.error(err);
  1052. server.close();
  1053. process.exit(0);
  1054. };
  1055. process.on('SIGINT', onExit);
  1056. process.on('SIGTERM', onExit);
  1057. process.on('exit', onExit);
  1058. process.on('uncaughtException', onError);
  1059. };
  1060. /**
  1061. * @param {any} server
  1062. * @param {function(string): void} [onSuccess]
  1063. */
  1064. const attachServerWithConfig = (server, onSuccess) => {
  1065. if (process.env.SOCKET || process.env.PORT && isNaN(+process.env.PORT)) {
  1066. server.listen(process.env.SOCKET || process.env.PORT, () => {
  1067. if (onSuccess) {
  1068. fs.chmodSync(server.address(), 0o666);
  1069. onSuccess(server.address());
  1070. }
  1071. });
  1072. } else {
  1073. server.listen(+process.env.PORT || 4000, process.env.BIND || '127.0.0.1', () => {
  1074. if (onSuccess) {
  1075. onSuccess(`${server.address().address}:${server.address().port}`);
  1076. }
  1077. });
  1078. }
  1079. };
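// Listening configuration (illustrative values):
//   SOCKET=/run/mastodon/streaming.sock  -> listen on a UNIX socket (chmod 666)
//   PORT=4000 BIND=0.0.0.0               -> listen on TCP, defaults 4000 / 127.0.0.1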
  1080. /**
  1081. * @param {function(Error=): void} onSuccess
  1082. */
  1083. const onPortAvailable = onSuccess => {
  1084. const testServer = http.createServer();
  1085. testServer.once('error', err => {
  1086. onSuccess(err);
  1087. });
  1088. testServer.once('listening', () => {
  1089. testServer.once('close', () => onSuccess());
  1090. testServer.close();
  1091. });
  1092. attachServerWithConfig(testServer);
  1093. };
  1094. onPortAvailable(err => {
  1095. if (err) {
  1096. log.error('Could not start server, the port or socket is in use');
  1097. return;
  1098. }
  1099. throng({
  1100. workers: numWorkers,
  1101. lifetime: Infinity,
  1102. start: startWorker,
  1103. master: startMaster,
  1104. });
  1105. });