Ran across an article that says core services are a bad idea. Given my background in infrastructure and things like BatchAPI, I got a little upset at the title. But I’m a tolerant person and open to learning, so I decided to read the article.
Turns out I don’t disagree with the article, but I’d probably give it a different title. What the author is calling Core Services is what I’d call shared global data. It’s the difference between platforms that services are built upon and services that are built into other services. BatchAPI, and platforms in general, come under the heading of Domain Expertise. The author says, and I agree with them, that it does make sense to have a centralized, core team to provide that expertise as a core platform.
On the other hand, consider the “User” database. Sure, there’s a service in front of the database, but really it’s just global data masquerading as an RPC call. It’s not that the data shouldn’t be shared; it’s that direct, synchronous sharing like that adds all sorts of complexity and cross-domain coupling that’s not always needed. Sometimes it is needed, the biggest reason being when you need immediate consistency, and you’ll need to design for those cases. But most of the time it isn’t. RPCs and user interactions are already asynchronous, and there’s nothing you can do to prevent a user from changing their mailing address milliseconds after you print it on their monthly billing statement. You have to handle that case some other way anyway.
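To make the alternative concrete, here’s a minimal sketch of what “sharing the data without the synchronous call” can look like: a billing service keeps its own read model of addresses by consuming address-change events, rather than calling the User service at statement time. All the names here (`AddressChanged`, `BillingService`, the event fields) are hypothetical, invented for illustration; a real system would get events from a message bus rather than a direct method call.

```python
from dataclasses import dataclass

# Hypothetical event the User service publishes when an address changes.
@dataclass
class AddressChanged:
    user_id: str
    new_address: str

class BillingService:
    """Keeps a local copy of addresses instead of making a synchronous
    RPC to the User service when printing a statement."""

    def __init__(self) -> None:
        self._addresses: dict[str, str] = {}

    def on_address_changed(self, event: AddressChanged) -> None:
        # Applied whenever the event arrives; the local copy may lag
        # briefly, which is acceptable because the interaction is
        # already asynchronous end to end.
        self._addresses[event.user_id] = event.new_address

    def print_statement(self, user_id: str) -> str:
        address = self._addresses.get(user_id, "(no address on file)")
        return f"Statement for {user_id}, mailed to {address}"

billing = BillingService()
billing.on_address_changed(AddressChanged("u1", "123 Main St"))
print(billing.print_statement("u1"))
```

The point isn’t the mechanics, it’s the coupling: the billing domain no longer depends on the User service being up, or fast, at the moment it renders a statement. And since the address can change milliseconds after printing either way, the eventual consistency here costs you nothing you didn’t already have to handle.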
The idea is that by defining strong bounded contexts, and crossing the boundary only when it’s truly needed, you decrease design-time and development-time coupling. Reducing that coupling increases velocity, which means delivering more value sooner.