
Discussion on: Java may be verbose, but who cares?

 
Dan Lebrero

Hi Nicolas,

Thanks a lot for the thoughtful answer.

We also have the problem that sometimes, when adding a new field or a new enum value, we need to change several services.

In my experience this usually happens because we autogenerate our Java classes from some schema, which makes our classes too rigid.

I agree that microservices are more complex, but also more flexible.

Would building microservices inside a monolith give us the best of both worlds?

Nicolas Bousquet

To me, there is no silver bullet. Microservices are just one way to componentize an application or product. They are great for some cases, terrible for others.

I would not consider microservices to componentize the plugins of a desktop image editing application, for example ;)

To me, whether you use microservices, fat services, a single service or no network service at all is not really important. What is important is that your components are well componentized.

Example: you or I might say that a service is XXX LOCs. But we do not include the JVM code. Neither do we include the Apache Commons libraries, Spring, or the application server (or Netty) code. If we use Clojure we do not include that either. And we count only the actual source code, not the compiled code...

This is because these components are extremely well componentized. The abstraction they provide is so good that you never have to look inside at how they work. You can, and it has value, but you don't have to.

Often, even over the network, there is more coupling between components than we think. The data format we use, even in JSON or XML, is far less generic than we think, because if we try to incorporate data from another provider we may be missing some concept entirely and the exchange format has to be revised.

These things are hard. Network services allow you to have things in different processes or on different computers, and that's nice. Resilient formats like JSON/XML help you add new features without breaking existing code, as long as clients are smart enough to ignore what they don't know in the message.
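For example, something like Jackson's tolerant-reader setting does exactly this on the consumer side. This is just a rough sketch, and the payload class and field names are made up:

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class TolerantOrderClient {

    // Unknown JSON properties are skipped instead of failing deserialization,
    // so the producer can add new fields without breaking this consumer.
    private static final ObjectMapper MAPPER = new ObjectMapper()
            .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

    public Order readOrder(String json) throws Exception {
        return MAPPER.readValue(json, Order.class);
    }

    // Hypothetical payload: only the fields this particular client cares about.
    public static class Order {
        public String id;
        public long amountCents;
    }
}
```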

But you can achieve the same with services defined as interfaces in Java (or Clojure). The ideas are quite simple in the end: the input is created by the client, and the one that receives it does not modify it; the output is created by the service, and the service ensures there is no dependency on its internal state that could create issues.

The service doesn't take a list of parameters, as adding more breaks client code and there is a narrow limit on what you can pass; instead it takes POJOs or Clojure maps. In both cases, it is possible to extend the data structure and assume default values for things that are missing (with POJOs, the default value can always be present).
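A rough Java sketch of both ideas, with hypothetical names: the service is an interface, the client builds the input object, and a newer field gets a default so older callers keep working:

```java
// Hypothetical in-process service: it takes one request object
// instead of a growing list of parameters.
public interface PricingService {
    PriceResult price(PriceRequest request);
}

// The client creates the input; the service does not modify it.
// Adding a field does not break existing callers because it has a default.
public class PriceRequest {
    private final String productId;
    private final String currency; // newer field, defaulted for old callers

    public PriceRequest(String productId) {
        this(productId, "EUR"); // assumed default value
    }

    public PriceRequest(String productId, String currency) {
        this.productId = productId;
        this.currency = currency;
    }

    public String getProductId() { return productId; }
    public String getCurrency() { return currency; }
}

// The service creates the output and keeps no reference to its internal state in it.
public class PriceResult {
    private final long amountCents;

    public PriceResult(long amountCents) { this.amountCents = amountCents; }

    public long getAmountCents() { return amountCents; }
}
```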

Debugging inside the same process and performing integration tests is much easier and faster, you entirely remove the need for serialization/deserialization, and you don't have to manage the various network errors...

Don't get me wrong, you won't necessarily want everything in the same VM either. It depends on the problem you are trying to solve. A good architecture brings natural boundaries, and some of these boundaries will be the network.

But the network itself brings a lot of complexity. I can see that clearly. In my company we have thousands and thousands of services. Many of these services are what you would call large codebases, and we have many farms of servers. It quickly becomes complex to understand which service is the right one to call, which dependent services will be impacted when you want to add a new feature, how to scale and keep latency low, how to scale the network itself...

One day or another, the comfort of the abstraction that protected us for years comes back to bite us ;) There are a hundred people in my company working on the enterprise bus, helping with the configuration and maintenance of all the services for various clients, and so on.

Dan Lebrero

Hi Nicolas,

All very true.

The point that I was trying to make about microservices is that I have found that they give you better components. I think it is because it is usually one team that produces them and several teams that consume them, which forces a more isolated and thoughtful design, forces things to be backwards compatible, and also makes the boundaries obvious.

I do not think they make things simpler. As you say, the network is a huge headache on its own, but I think they force us to follow good practices.

On the other hand, with monoliths I find it easier to put in some hack, break encapsulation and end up with a big ball of mud.

Of course, I have also seen the death star from Netflix :)

Thanks a lot!

Dan

Nicolas Bousquet • Edited

I agree that network services over a resilient exchange format (like XML, Protobuf, JSON...) greatly help with backward compatibility, and if you are serious about the documentation, versioning and all that, it is a great way to isolate a component.

I have always found componentization to be extremely hard to achieve. A web service can very quickly become non-backward-compatible, or expose a proprietary format or a data structure that is incomplete or not future-proof. The cost of maintenance is then huge.

When you are in the same process, the issues are far easier to solve but also far easier to create, and once there are too many of them, nobody manages to remove them. This is because languages like C, Java or Clojure do not really solve modularity issues.

Java 9 is going to take a shot at a module system, and OSGi has been around for quite some time in the Eclipse ecosystem. But that last one is not easy to use.

What we do in our project is that each component, in its simplest form, has 3 Maven modules: an interface module, an implementation module, and an aggregator module that depends on both: the first with the standard compile scope, the second with the runtime scope.

Client code depends on the aggregator. It sees the public interface perfectly well, but it can't statically reference the implementation module, as that is not on the classpath at compilation time.
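As an illustration, the aggregator's dependency section might look something like this (the group and artifact names are invented):

```xml
<!-- foo-aggregator/pom.xml: the module that client code depends on -->
<dependencies>
    <!-- The API is on the compile classpath, so clients see the public interfaces -->
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>foo-api</artifactId>
        <version>${project.version}</version>
    </dependency>
    <!-- The implementation is runtime-only, so clients cannot reference it in code -->
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>foo-impl</artifactId>
        <version>${project.version}</version>
        <scope>runtime</scope>
    </dependency>
</dependencies>
```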

Linking is done either by Spring (typically with annotations) or, for components that are not expected to be bound to Spring, by a singleton in the interface module that accesses the implementation by introspection, typically through a ServiceLocator or an equivalent mechanism.
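For the non-Spring case, a minimal sketch of such a locator could look like this (the names are invented, and the implementation class is resolved by reflection so the interface module never references it at compile time):

```java
// In the interface module (separate files in practice):
// the public contract that clients compile against.
public interface PaymentService {
    void charge(String accountId, long amountCents);
}

// Also in the interface module: resolves the implementation lazily by reflection,
// so there is no compile-time dependency on the implementation module.
public final class PaymentServiceLocator {

    // Assumed convention: the implementation class name is only known as a string.
    private static final String IMPL_CLASS = "com.example.payment.impl.DefaultPaymentService";

    private static volatile PaymentService instance;

    private PaymentServiceLocator() {}

    public static PaymentService getInstance() {
        if (instance == null) {
            synchronized (PaymentServiceLocator.class) {
                if (instance == null) {
                    try {
                        instance = (PaymentService) Class.forName(IMPL_CLASS)
                                .getDeclaredConstructor()
                                .newInstance();
                    } catch (ReflectiveOperationException e) {
                        throw new IllegalStateException(
                                "Implementation is not on the runtime classpath", e);
                    }
                }
            }
        }
        return instance;
    }
}
```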

The implementation can implement any interface and have any kind of public/protected visibility, and clients can't abuse that without doing it on purpose (moving the class, making the access public or adding a compile-time dependency on the impl). Put it in a separate git repository (likely for a group of modules around a feature) and it becomes harder for that to go unnoticed.

What we still miss, but which I guess could be added, is a Maven build failure if you try to use any scope other than runtime on an "impl" module.
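Something like the maven-enforcer-plugin's bannedDependencies rule could probably express that, assuming the implementation artifacts follow a naming convention such as *-impl (this is an untested sketch, not something we have in place):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <executions>
        <execution>
            <id>ban-non-runtime-scope-on-impl</id>
            <goals>
                <goal>enforce</goal>
            </goals>
            <configuration>
                <rules>
                    <bannedDependencies>
                        <!-- Hypothetical convention: implementation modules are named *-impl.
                             Any scope other than runtime should fail the build. -->
                        <excludes>
                            <exclude>com.example:*-impl:*:jar:compile</exclude>
                            <exclude>com.example:*-impl:*:jar:provided</exclude>
                            <exclude>com.example:*-impl:*:jar:test</exclude>
                        </excludes>
                        <message>Implementation modules may only be used with runtime scope.</message>
                    </bannedDependencies>
                </rules>
            </configuration>
        </execution>
    </executions>
</plugin>
```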

Dan Lebrero

I worked with OSGi for four years back in the early days, and it is one of the best frameworks for forcing you to think about how to componentize your application.

But the most useful lesson was to think about how your system should behave if one of the components was not present or was being restarted/upgraded.

It is an experience that translated quite nicely to the microservices world and that still helps me every time we build a new system. Somebody should write a book called "What happens if this dependency is not available?".

What OSGi did not teach me was that the network is a PITA. I learned that later :)

Thanks!

Dan

Matheus Mohr

I just wanted to say it was a pleasure to watch your discussion guys haha

Dan Lebrero

Kudos to Nicolas!