#The Problem We just described the standard design issues that arise when you start creating layers of services, DAOs, and other components to implement an application. That blog/gist is here.
The goal is to think through some designs in order to develop something useful for an application.
#Working through Layers If you compose services and DAOs the normal way, you typically get imperative-style objects. For example, imagine the following:
object DomainObjects {
type UserId = Long
case class User(id: UserId, properties: Map[String, Any])
}
import DomainObjects._
/**
* The service layer. Most service layers define the unit of work. Many
* times a unit of work in the service layer is the same as that implemented
* in the DAO layer, but that's because some internet examples are too small
* to show the nuances. Since this layer defines units of work, this is where
* the transaction strategy is also implemented. Spring implements this with
* annotations and proxies/byte-code enhancers. We'll wire things up explicitly
* because we are not using proxies/byte-code enhancers the way Spring does.
*
* Classic cake pattern. Any implementation must define
* a service val (or directly create an object) to satisfy
* this abstract type.
*/
trait UserServiceComponent {
val service: UserService
trait UserService {
def updateName(user: UserId, newName: String): Either[String, User]
def findById(id: UserId): Option[User]
}
}
/**
* This is the usual "interface" declaration.
*
*/
trait UserDaoComponent {
val userDao: UserDao
trait UserDao {
/**
* Find a user by their id.
*/
def findById(id: UserId): Option[User]
/**
* Update the user properties, whatever has changed.
*/
def update(user: User): Unit
}
}
/**
* The simple application auditing component.
*/
trait AuditDaoComponent {
/**
* We define a def here. If you want singleton behavior,
* define a private member _auditDao, instantiate it once,
* and return it from the def. If you want a new auditDao
* each time, create a new one each time this def is called.
* DAOs traditionally do not hold any state, so which you
* choose is not too critical for most applications, but the
* option is demonstrated here by using a def instead of a val.
*/
def auditDao: AuditDao
trait AuditDao {
/**
* Send the changes to an audit log somewhere.
*/
def auditChange(user: User, changedProperties: Seq[String]): Unit
}
}The next step would be to create the implementation classes. But here's where you get stuck. In Spring, you would use @Transactional and container injection (with or without Java config) to configure your objects. In other words, you would provide specific technology choices in the form of annotations like @Transactional or @Autowired. Since Scala and the cake pattern do not use byte-code generation or proxies, you have to be a bit more explicit about the technology choices. But we do not want to bake a specific technology into the service and DAO "interface" level.
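To make the contrast concrete, here is a minimal sketch of wrapping a unit of work explicitly instead of relying on a proxy. `Session` and `withTx` are hypothetical stand-ins for whatever your persistence technology provides:

```scala
object ExplicitTx {
  // Hypothetical session type; a real one would come from the driver.
  final case class Session(id: Int)

  // Explicit transaction wrapper: what Spring's @Transactional proxy
  // would otherwise weave in around each method.
  def withTx[A](body: Session => A): A = {
    val session = Session(1) // begin transaction
    try body(session)        // run the unit of work
    finally ()               // commit/rollback would happen here
  }

  // A service method explicitly declares its own unit of work.
  def updateName(id: Long, newName: String): String =
    withTx { session => s"updated $id to $newName" }
}
```

The cost of losing the proxy is this one explicit `withTx { ... }` per unit of work; the benefit is that the transaction boundary is visible and type-checked.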
So what we really need to do is reframe the standard service and DAO model so that it is more flexible--which in this case means more composable.
In this case, we really need to return not the actual value directly, but a function that produces the value. This way, we can then wrap the functions and compose them together with others. For example, we can call the function asynchronously or synchronously.
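As a sketch of what that buys us (with a hypothetical `Session` type standing in for a real database session), returning `Session => A` instead of `A` lets the caller sequence several DAO calls into one function and decide later how, and under which session, to run it:

```scala
object FunctionReturningDao {
  // Hypothetical stand-ins for a real session and domain model.
  final case class Session(name: String)
  type UserId = Long
  final case class User(id: UserId, name: String)

  // Instead of Option[User], return a function Session => Option[User].
  def findById(id: UserId): Session => Option[User] =
    session => if (id == 1L) Some(User(1L, "alice")) else None

  def rename(user: User, newName: String): Session => User =
    session => user.copy(name = newName)

  // The two calls compose into a single Session => _ function, which a
  // service layer can run inside one transaction, now or asynchronously.
  def updateName(id: UserId, newName: String): Session => Option[User] =
    session => findById(id)(session).map(u => rename(u, newName)(session))
}
```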
The most obvious approach is to use the Reader monad, say from scalaz:
trait UserServiceComponent[T] {
val service: UserService
trait UserService {
def updateName(user: UserId, newName: String): Reader[T, Either[String, User]]
def findById(id: UserId): Reader[T, Option[User]]
}
}Because we did not want to specify the actual "value" in the Reader that would be "injected" into the function, we parameterized the type. We could also have had UserService take a constructor parameter that contains an environment, so we need to think this through. But there are issues with parameterization, namely that the parameter gets carried throughout the API. And we do not want to lock ourselves into the Reader monad either, as that reduces flexibility. So genericizing our DAO structures using type parameters can work, but we may want to try abstract type members (existential types) as well so that we can mix in the context and the service/DAO object can consume it as it sees fit. Since existential types fit naturally into a cake layer, we are choosing cake over type parameterization. The cake layer, at least using self-types, can also ensure that the dependencies are always available within the context. So it not only forces a certain way of composing our service/DAO, it also helps with dependency management, which is one of our objectives.
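Here is a rough sketch of the abstract-type-member alternative. All names are hypothetical, and a plain `Map` stands in for a real session so the example stays self-contained:

```scala
// The context is an abstract type member rather than a type parameter,
// so it does not spread through every signature in the API.
trait ContextComponent {
  type Context
}

trait UserDaoComponent2 { self: ContextComponent =>
  val userDao: UserDao
  trait UserDao {
    // Methods return Context => A rather than A directly.
    def findById(id: Long): Context => Option[String]
  }
}

// A concrete assembly fixes Context to a real session type.
object InMemoryApp extends UserDaoComponent2 with ContextComponent {
  type Context = Map[Long, String] // stand-in for a DB session
  val userDao = new UserDao {
    def findById(id: Long): Context => Option[String] = ctx => ctx.get(id)
  }
}
```

Note that `Context` appears only inside the component, and the self-type guarantees a `ContextComponent` is mixed in wherever the DAO is assembled.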
First let's take a quick look at a standard approach of using a cake layer to weave in the database needed to open a transaction/session.
#Remembering Slick Let's look at how to provide some simple Slick-specific parts to the service and DAO implementations. We saw in a previous blog that we could do something like:
/**
* If you only define queries, you do not need a database; you just need a profile.
* This profile is all that is needed to mix in.
*/
trait ProfileContext {
/**
* To allow automatic query lifting with a Session: {{{ import profile.Implicit._ }}}
*/
val profile: BasicProfile
}
/**
* A cake layer with abstract members that say a database object
* and a driver profile are available. This
* allows you to "lift" queries for execution in the driver and to create
* transactions to wrap your queries. Conceptually, this corresponds to providing
* a Spring @Transactional annotation, giving you control over units of work, and
* a JPA EntityManager so you can define and execute queries. This is one way to
* mix in the database so we can get access to objects that can provide us
* transactions. We could skip this trait and use the DataComponent trait.
*
* If you are using this component to
* open a transaction {{{database.withTx}}}, you are not using proxies or AOP to slice
* in transactional behavior the way Spring does, so you have to be explicit. The Spring
* @Transactional annotation and the container wrap each method in a transaction
* automatically.
*
* This component allows us, in a totally type-safe way, to provide access
* to the instance that allows us to run transactions. We could have
* abstracted the database away and just said there is an object there that
* provides a transaction method {{{withTx}}} using structural (duck) typing, but
* we stuck the database object in as a member to make it apparent where
* it was coming from. We could also abstract the profile away. Since we include
* it mostly to lift our queries into the driver via implicits, we could
* just have provided some implicit def that did that as well. But again, it was
* easier just to provide the profile. You'll need to call {{{import profile.Implicit._}}}
* in your code so that {{{Query}}}s are lifted into the driver to call {{{.run}}}.
*/
trait DatabaseContext extends ProfileContext {
/**
* This is a def because a val (we can't use a var) would force
* us to define the object at compile time, whereas most databases
* are configured and opened dynamically in an application based
* on configuration parameters (e.g. the host name you want to
* connect to). Prefer defs for flexibility unless you know
* the object can be configured at compile time. Many
* times it can be, but for databases that must be "opened" after
* the program starts, you need to use a def.
*
* Note that {{{profile.simple.Database}}} is a path-dependent type:
* it depends on the profile member mixed in via ProfileContext.
*/
def database: profile.simple.Database
}And then our service and DAO could be something like:
/**
* This layer adds a simple implementation. Since this layer
* uses database specific calls, it needs the database within scope
* in order to create a transaction.
*
* The implementation assumes some knowledge of how to create
* a transaction or how to create a session context for the lower level
* layers (the DAOs). By including a DatabaseContext we bake in knowledge of the Profile type and
* enforce the need for having a database and profile members so that
* the service layer can handle transactions (through the database object).
* We could really use any "object" that is specific to the technology.
* Strictly, the business
* layer should only have knowledge about creating a transaction but we
* want to remain flexible.
*
* The database is accessed merely to be able to open transactions
* which define units of work, and these should be managed in the
* business layer. Without bytecode generation or proxies as in Spring, we cannot
* intercept the "methods" on the class and wrap them in a transaction
* automatically. But avoiding proxies and bytecode generation is rather
* the point of static typing.
*
* We could also convert this to a class that takes constructor
* parameters, or inject the database through a setter.
* We could also fiddle with implicits (which is much harder in this
* scenario). If we dropped the type parameter, we would need to
* ensure that the injected DAOs (through setters) had the right type.
*
*/
trait UserServiceImpl extends UserServiceComponent {
self: UserDaoComponent with DatabaseContext =>
/**
* Override the "forced" val definition so that the type is more specific.
* You don't have to do this if the standard UserService interface
* is all that you need. You don't need the "override" here but we use
* it to be clear that it's the same definition as in the supertrait.
*/
override val service: UserService
class UserService extends super.UserService {
import profile.simple._
/**
* Update the name and return the new user object. This is considered
* an Unit of Work for our application.
*/
def updateName(user: UserId, newName: String): Either[String, User] = {
database.withTx { implicit session =>
// An example lifted query run inside the same transaction.
val x = CypherQuery("""start n=node(*) return n""")
x.run
val newUser: Option[User] = for {
existing <- findById(user)
changes = Map("name" -> newName)
updated <- Option(existing.copy(properties = existing.properties ++ changes))
} yield updated
// Only run for the side-effect of the update.
newUser.foreach(userDao.update)
newUser.toRight("No user " + user + " found")
}
}
/**
* This is more of a pass through.
*/
def findById(user: UserId): Option[User] = {
database.withTx { implicit session =>
userDao.findById(user)
}
}
/**
* Another service method that just shows you can add methods and that the
* abstract type member "service" is better off with the more specific
* UserService type declared in this UserService implementation. Instances
* accessing the "service" can see this method directly since the type
* is specific. Again, you may or may not want this in your design.
*/
def doSomethingTransactionalWithUser(user: UserId): Boolean =
database.withTx { implicit session =>
println("Doing something transactional with user: " + user)
true
}
}
}But this only helps with ensuring required dependencies are available (in this case a Slick implementation); it does not help composability. What it does show us is that to get transactional behavior, we need a database-like object. Also, if, as in Slick, queries are to be defined in the service, we would need a "profile" as well. It's pretty easy to see that if we want to abstract away the specific technology choice, we need to introduce an abstraction at the "UserService" level, and whatever we introduce needs to be technology agnostic.
So it looks like if we want to use a constructor parameter or cake layer, it will probably need to carry both the "database" and the "profile," plus something that defines a type for the service/DAO methods so they can return a function instead of a direct value.
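As a sketch of that idea, the environment can be one small value that bundles the technology-specific pieces, and the service methods become functions of it. All names here are hypothetical, and simple stubs stand in for the Slick types:

```scala
// A hypothetical environment bundling the driver profile and database.
final case class DbEnv[Profile, Database](profile: Profile, database: Database)

// Service methods return functions of the environment, not raw values.
trait UserService3[P, D] {
  def findById(id: Long): DbEnv[P, D] => Option[String]
}

// A stub assembly: a String "profile" and a Map "database".
object StubService extends UserService3[String, Map[Long, String]] {
  def findById(id: Long): DbEnv[String, Map[Long, String]] => Option[String] =
    env => env.database.get(id)
}
```

The drawback is visible immediately: the two type parameters get carried through every signature, which is exactly the objection raised above.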
#Improving Composability To improve composability, methods need to return functions instead of just raw values. If we did not do this, the imperative nature of the DAO would not allow us to compose a sequence of DAO calls or wrap functions within functions, which is a more functional way of writing code.
We know that the implementations will need a technology-specific context to execute under--an environment. So we need an abstraction for the environment and an easy way to apply it. We have a bunch of choices:
- Put this technology-specific context at the UserDao trait level (and make UserDao a class) so that it becomes a constructor argument. But then a new UserDao must be constructed each time you need to use the DAO. This restricts how you can handle the technology-specific component and could limit future design choices.
- Provide a type parameter. This could also work, but type parameters sometimes make inheritance more restricted.
- Use an abstract type member (the overall object is then known as an existential type). This also allows us to combine the context members across all components that are instantiated together, so one context can satisfy the needs of multiple components. But this sounds a bit too open-ended.
- Use the Reader from scalaz or, if we wanted, a Kleisli object, either of which can lift a function (A => M[B]) into an environment. But using Reader is actually rather restrictive; perhaps someone wants to use their own Reader or equivalent in their functions.
So the options all look good, but we can take some lessons from the Typesafe Slick driver. The Slick drivers use a lifted-embedding design to lift the queries (which are object types separate from the driver) into the driver for execution. They use implicit defs that, when brought into scope using import profile.simple._, lift the query object into the appropriate driver. That seems like a decent way to do this.
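A minimal sketch of that lifting idea, with hypothetical names standing in for Slick's: queries are plain values, and an import brings an implicit conversion into scope that makes them runnable against an environment:

```scala
object LiftingLayer {
  // A Map stands in for a real database session.
  type Session = Map[Long, String]

  // A "query" defined independently of any driver or session.
  final case class Query[A](run0: Session => A)

  object Implicits {
    // Importing Implicits._ adds a `.run` method to any Query, much
    // like importing profile.simple._ lets Slick queries call .run.
    implicit class RunnableQuery[A](val q: Query[A]) {
      def run(implicit session: Session): A = q.run0(session)
    }
  }
}
```

The query value itself stays technology-agnostic; only the import site (and the implicit session it requires) commits to a particular execution environment.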
#Playing with a Cake Layer and a Lifting Layer and a Composable Abstraction
...Work in progress...