artsy / eigen

The Art World in Your Pocket or Your Trendy Tech Company's Tote, Artsy's mobile app.

[RFC] Improved search in eigen

gkartalis opened this issue · comments

Status: Closed

Type: New Dependencies

Name: react-instantsearch-native
github url
npmjs url

Name: algoliasearch
github url
npmjs url

Description

On the FX team, we want to integrate Algolia in order to provide a richer search experience within the mobile app, backed by the Algolia platform. (You can find more info in the tech plan here.)

Approaches

There are currently two approaches we could take to do that in eigen.

  1. Use the algoliasearch API client directly from the eigen app, together with their component library react-instantsearch-native (docs), which provides React HOCs and components to help us build the UI (they don't have support for hooks yet).

    This would require us to install the following 3 dependencies in eigen:

    • react-instantsearch-native
    • @types/react-instantsearch-native
    • algoliasearch

    A small POC of this can be found in this draft PR

  2. Don't query Algolia directly; instead, expose a GraphQL interface from metaphysics that proxies requests to Algolia. No dependency installation needed in eigen.
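
To make the trade-off concrete, here is a rough TypeScript sketch of the request shape behind approach 1. The function name and numbers of indices are illustrative, not from the eigen codebase; in practice the algoliasearch client and react-instantsearch-native assemble this payload for you, as a single multi-query POST to Algolia's REST API. Under approach 2, metaphysics would build and send an equivalent request server-side.

```typescript
// Sketch of the multi-index query body that the algoliasearch client
// sends to Algolia's REST search endpoint. buildMultiQueryBody is a
// hypothetical helper for illustration only.
interface SearchRequest {
  indexName: string
  params: string // URL-encoded query parameters, e.g. "query=picasso"
}

function buildMultiQueryBody(
  query: string,
  indexNames: string[]
): { requests: SearchRequest[] } {
  return {
    requests: indexNames.map((indexName) => ({
      indexName,
      // Same query string fanned out to each index.
      params: new URLSearchParams({ query }).toString(),
    })),
  }
}

// Approach 1: this goes straight from the device to Algolia.
// Approach 2: a GraphQL query to metaphysics would produce the same
// request one network hop later.
const body = buildMultiQueryBody("picasso", ["Artists", "Artworks"])
```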

Additional context

The mobile feature will likely be an extension of what's in the app today, not a replacement. You'll still be able to use the existing global search, but you can click into a deeper search experience where you can filter by entity and apply filters.

slack discussion

commented

Sounds very reasonable to me! I also don't know much about mobile search libraries, so I have nothing against this 😄

Thanks for opening @gkartalis!

Just want to add that we'd prefer to pull in the full react-instantsearch-native dependency so that we can use its UI components as recommended by Algolia:

If you’re building a web application, it’s recommended to use one of Algolia’s front-end search UI libraries instead of using the API client directly.

Algolia’s InstantSearch UI libraries offer an out of the box, good-looking and customizable search UI with instant results, unlimited facets and filters, and more configurable features. By using InstantSearch, you get better response times, because the requests go directly from your end users to the Algolia servers. This also lowers the burden on your servers from real-time search activity.

source: https://www.algolia.com/doc/api-client/methods/search/

Don't query Algolia directly; instead, expose a GraphQL interface from metaphysics that proxies requests to Algolia. No dependency installation needed in eigen.

If I understand correctly, this would likely be much slower, right? Instead of hitting Algolia directly - and I'm assuming their components are highly optimized - we would be building components from scratch which rely on sending a request to Metaphysics, which then hits Algolia, which returns a result, which is passed through Metaphysics and finally to the client. So to @dblandin's point above, seems much better to introduce these dependencies.
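
A back-of-envelope sketch of the latency argument above, with every number an assumption for illustration rather than a measurement:

```typescript
// Assumed round-trip times in milliseconds (illustrative only).
const clientToAlgoliaMs = 60 // device -> Algolia edge and back
const clientToMetaphysicsMs = 80 // device -> metaphysics and back
const metaphysicsToAlgoliaMs = 40 // metaphysics -> Algolia and back
const proxyOverheadMs = 20 // GraphQL parsing / resolving on the server

// Approach 1: one hop, straight to Algolia.
const directMs = clientToAlgoliaMs

// Approach 2: two hops plus proxy work, paid on every keystroke of a
// search-as-you-type UI.
const proxiedMs = clientToMetaphysicsMs + metaphysicsToAlgoliaMs + proxyOverheadMs
```

Under these assumptions the proxied path costs roughly twice as much per request, which compounds quickly for instant-search interactions.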

We're confident we want to use Algolia for search for the foreseeable future. If this were a quick hack exploration and we hadn't just signed a contract with them, then maybe we would want to avoid new deps, but we've signed a contract and we're paying them for what they offer. We should commit to that!

This seems reasonable to me. I think this will help us definitely provide a better search experience and get a higher conversion. I also agree with picking the first approach but I do have some open questions about it:

  • If at some point in the future we decide to move from Algolia to a better vendor or an in-house solution, can we just flip a switch, or will we have to keep our contract with Algolia active until the majority of our users have updated their apps?
  • Would using Algolia to "outsource" our searches affect our recommendations? That is, do we use the data users search for anywhere in our recommendation engine, and would removing that data affect our artwork recommendations?

Edit:
I know you said

The mobile feature will likely be an extension of what's in the app today, not a replacement. You'll still be able to use the existing global search, but you can click into a deeper search experience where you can filter by entity and apply filters.

but this is still not clear to me. Do you mind explaining how this will work?

Thanks for the feedback!

If at some point in the future we decide to move from Algolia to a better vendor or an in-house solution, can we just flip a switch, or will we have to keep our contract with Algolia active until the majority of our users have updated their apps?

Not sure I can speak to the contract, but here is how the feature will be implemented initially:
A feature flag toggles between the two search screens. If the switch is turned off, users in eigen won't have access to the improved search screen, and therefore zero requests go to Algolia. Keeping the feature flag in place even after releasing the feature will let us stop using this third party for whatever reason, whenever we want.

Would using Algolia to "outsource" our searches affect our recommendations? That is, do we use the data users search for anywhere in our recommendation engine, and would removing that data affect our artwork recommendations?

Good point! I'm getting some extra context on how exactly our recommendations work and whether they take search queries into account, and I'll get back to you soon.

The mobile feature will likely be an extension of what's in the app today, not a replacement. You'll still be able to use the existing global search, but you can click into a deeper search experience where you can filter by entity and apply filters.

In the beginning we will roll this out step by step: we'll keep our multi-index search backed by Gravity, and use Algolia to "enhance" searches on specific indices (e.g. Artists). That will let us provide faster, more precise search based on the attributes each index has.

We also have this WIP figma.

By pressing any of the pills, we'll be able to switch between indices while keeping the same query, and get faster results depending on what the user is searching for.
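
The pill row described above can be modeled as a mapping from pill to index, with the query string held constant. A hypothetical sketch (the labels, index names, and helper are illustrative, not eigen code):

```typescript
// Hypothetical model of the pill row: each pill maps to one Algolia
// index, and tapping a pill re-runs the current query against that
// index instead of starting a new search.
interface Pill {
  label: string // what the user sees, e.g. "Artists"
  indexName: string // the Algolia index behind it
}

function queryForPill(
  query: string,
  pill: Pill
): { indexName: string; query: string } {
  // Same query string, different index: switching entities only swaps
  // the target index, which is why it can feel instant.
  return { indexName: pill.indexName, query }
}
```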

cc @dblandin feel free to chime in and add additional context that I might be missing!

+1 to using Algolia clients to query directly, if possible, in order to see the greatest speed benefits from their platform

Would using Algolia to "outsource" our searches affect our recommendations? That is, do we use the data users search for anywhere in our recommendation engine, and would removing that data affect our artwork recommendations?

Good point! I'm getting some extra context on how exactly our recommendations work and whether they take search queries into account, and I'll get back to you soon.

In case you're still looking for this info @gkartalis, the various ingredients in our homegrown recommender system are laid out here in Cinder (our Apache Spark system for off-line big data processing).

Search history is not factored in currently, so I see no risk there. In fact, this may unlock some recommendation options that are powered by Algolia.

In case you're still looking for this info @gkartalis, the various ingredients in our homegrown recommender system are laid out here in Cinder (our Apache Spark system for off-line big data processing).

Thanks @anandaroop really useful link, taking a look now!

Sounds like we can close this issue and go ahead with the first approach.

This is the related PR: #5254

@francoischalifour thanks for letting us know! Will definitely give em a try! 🚀 🎉