How should I shard my Axiom stores?
Hey,
I'm trying to figure out the appropriate way to shard the stores for my application.
There's a large (let's say 1E5) number of users. Each user has a bit of information (name, email, password, billing service reference, a short profile: all in all no more than a few kB). Each user has a number of conversations (let's assume these are 1-on-1 for now) with a number of other users; each user has fewer than 1E2 such conversations.
One of the first suggestions was to have one store per user. That's how Mantissa does things by default. I don't think this is an appropriate solution for my problem, because I want to be able to query users (for example, to find people in particular locations or age categories). Obviously, querying all of those stores is a lot harder than leveraging the tools SQLite already provides for me. Sharding these out by user id and then writing some map-reducy code to eventually make this scale across cores may be a solution -- but then I'd have O(server_load) stores, not O(users) stores -- and certainly not exactly the same number of stores as users.
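To make the querying concern concrete, here's roughly the kind of query I mean. This is a minimal sketch; I'm assuming Axiom's Item/attributes API based on its examples, and the item class, attribute names and store path are all invented for illustration:

    from axiom.store import Store
    from axiom.item import Item
    from axiom.attributes import text, integer, AND

    class User(Item):
        # Explicit typeName/schemaVersion, as in the Axiom examples I've seen.
        typeName = 'myapp_user'
        schemaVersion = 1

        name = text()
        email = text()
        passwordHash = text()
        billingRef = text()
        profile = text()
        location = text()
        age = integer()

    store = Store('users.axiom')

    # With every User in one store, SQLite does the filtering for me;
    # with one store per user this turns into 1E5 separate queries.
    twentysomethingsInParis = store.query(
        User,
        AND(User.location == u'Paris',
            AND(User.age >= 20, User.age < 30)))

The point being: one query against one store, instead of opening 1E5 stores and filtering in Python.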
Right now, I think the best way to handle this is to shard both users and conversations by id, with a sort of address book store that remembers which hosts have which data. So, you'd have N stores with users, N stores with conversations/messages, and 1 store to tell you which store to find things in.
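A minimal sketch of what that address book store could contain, assuming a simple id-modulo-N assignment of shards; ShardEntry, shardFor and the paths are names I've made up, not anything Axiom or Mantissa provides:

    from axiom.item import Item
    from axiom.attributes import text, integer, AND

    N_SHARDS = 16  # arbitrary choice for the example

    class ShardEntry(Item):
        typeName = 'myapp_shard_entry'
        schemaVersion = 1

        kind = text()         # u'users' or u'conversations'
        shardIndex = integer()
        storePath = text()    # path (or host) of the store holding this shard

    def shardFor(directory, kind, objectId):
        # Which store holds this user/conversation id?
        return directory.findUnique(
            ShardEntry,
            AND(ShardEntry.kind == kind,
                ShardEntry.shardIndex == objectId % N_SHARDS))

Entries would be created once (e.g. ShardEntry(store=directory, kind=u'users', shardIndex=i, storePath=u'/srv/shards/users-%d.axiom' % i)), and every read or write would first ask the directory which store to open.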
Alternatively, I could put all messages sent to a particular user in one store (potentially shared with that user's own store: a sort of inbox -- assuming I can fix the querying problem). This would also make it easier to implement IRC-style channels later. I can reconstruct the conversation between two users A and B (as user A) as follows:
1. Query A's inbox for messages sent by B
2. Query B's inbox for messages sent by A
… and then merge the two result sets, sorted by timestamp (rough sketch below).
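Roughly, that reconstruction could look like the following, assuming one inbox store per user and a Message item with sender/recipient ids and a timestamp (again, all names are invented for the example):

    from axiom.item import Item
    from axiom.attributes import integer, text, timestamp

    class Message(Item):
        typeName = 'myapp_message'
        schemaVersion = 1

        sender = integer()      # user id of the author
        recipient = integer()   # user id of the inbox owner
        sentAt = timestamp()
        body = text()

    def conversation(inboxA, inboxB, idA, idB):
        # Messages B sent to A live in A's inbox, and vice versa;
        # merge the two and order by time sent.
        fromB = list(inboxA.query(Message, Message.sender == idB))
        fromA = list(inboxB.query(Message, Message.sender == idA))
        return sorted(fromA + fromB, key=lambda m: m.sentAt)

If per-store ordering matters, I believe query() takes a sort argument, but the final merge across two stores has to happen in Python either way.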
I have no experience with Axiom beyond the trivial hello-world, so any guidance is appreciated. Thanks in advance!
cheers
lvh