Front end architecture for CQRS
April 20, 2019
Many of us have been covered in CRUD at some point in our careers. I may do a later post on why CQRS and event sourced architectures are superior and the future of large software systems. For now, I'm going to talk about how and why we're building our front end the way we are.
One of the complexities of CQRS is that, from a user's point of view, everything is supposed to happen now. A common solution is optimistic updating, but that only addresses one of several problems with this model.
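To make that concrete, here's a minimal sketch of optimistic updating as a plain reducer (the action types and state shape are hypothetical): apply the change locally as soon as the user event fires, then confirm or roll back when the backend answers.

```javascript
// Hedged sketch of optimistic updating; action types are hypothetical.
// The request event applies the change immediately and keeps a rollback copy;
// success drops the copy, failure restores it.
function preferenceReducer(state = { value: null, previous: null }, action) {
  switch (action.type) {
    case 'PREFERENCE_UPDATE':
      return { value: action.payload.value, previous: state.value }
    case 'PREFERENCE_UPDATE_SUCCESS':
      return { ...state, previous: null }
    case 'PREFERENCE_UPDATE_FAILURE':
      return { value: state.previous, previous: null }
    default:
      return state
  }
}
```

The user sees their change instantly either way; the failure case is where the other problems with this model start to show up.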
We're using React and a few libraries to create our front end. Redux and Redux-Saga are the core ones. The fundamental reason we're using these is that (some of, and hopefully eventually all of) our back end is event based. Much like many of you, we have an old legacy system that we're trying to update and improve upon.
We have a couple fundamental principles we've come up with designing and redesigning this front end.
Write the front end as if the back end were written in a perfect world
A layer of abstraction prevents unnecessary coupling, and provides an interface you control.
Compress state and weirdness
Keep business logic and state in as few places as possible. This doesn't mean to have giant components, but make a few smart components, and many dumb components.
User events are the most basic building blocks
These events are fundamental to your domain and application. Your back end only supports a finite number of operations and features. Defining the user events first creates a source of truth. From there, you can figure out how to tie things together and add back end features.
Now into the architecture and file structure! I'll show some simple example code for a system to get user preferences. We have 5 main folders that break up our front end:
Operations
- Operations are broken up into different domains for separation of concerns
- This is basically your event based API, providing a layer of abstraction.
- Events contain all of the necessary data to execute.
- Everything in here is stateless and functional.
operations
└───User
│ │ actions.js
│ │ constants.js
│ │ saga.js
│ │ services.js
│ │
│ └───__tests__
│ │ actions.test.js
│ │ saga.test.js
│ │ ...
│
└───SomeOtherSubdomain
│ ...
actions.js
Contains the user events for the subdomain.
import { createAction } from 'redux-actions'
import {
  USER_PREFERENCES_REQUISITION
} from './constants'
export const userPreferencesRequisition = createAction(
  USER_PREFERENCES_REQUISITION.ACTION,
  userId => ({ userId })
)
export const userPreferencesRequisitionSuccess = createAction(
  USER_PREFERENCES_REQUISITION.SUCCESS,
  preferences => ({ preferences })
)
export const userPreferencesRequisitionFailure = createAction(
  USER_PREFERENCES_REQUISITION.FAILURE
)
constants.js
Contains the constants for the actions.
import { defineAction } from 'redux-define'
export const namespace = 'preferences'
export const USER_PREFERENCES_REQUISITION = defineAction(
  'USER_PREFERENCES_REQUISITION',
  ['SUCCESS', 'FAILURE'],
  namespace
)
saga.js
The saga catches all of the user events defined and sends them to the backend to be handled. For event sourced systems, polling is usually done here, looping until the backend has fully processed the request.
import { takeLatest, call, put } from 'redux-saga/effects'
import {
  USER_PREFERENCES_REQUISITION
} from './constants'
import {
  userPreferencesRequisitionSuccess,
  userPreferencesRequisitionFailure,
} from './actions'
import {
  getUserPreferences,
} from './services'
export default function* userPreferencesSaga() {
  yield takeLatest(USER_PREFERENCES_REQUISITION.ACTION, handleUserPreferencesRequest)
}
export function* handleUserPreferencesRequest({ payload }) {
  try {
    const preferences = yield call(getUserPreferences, payload.userId)
    yield put(userPreferencesRequisitionSuccess(preferences))
  } catch (error) {
    yield put(userPreferencesRequisitionFailure(error))
  }
}
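For the event sourced case, a "success" response often only means the command was accepted, not that it's been processed. A hedged sketch of the polling loop, written as a plain async helper that a saga could `call` (the injected `checkStatus` function and its parameters are hypothetical):

```javascript
// Hedged sketch of the polling loop for event sourced backends.
// `checkStatus` is a hypothetical injected function returning a promise
// of a boolean: true once the backend has fully processed the request.
async function pollUntilProcessed(checkStatus, { retries = 10, intervalMs = 500 } = {}) {
  for (let attempt = 0; attempt < retries; attempt++) {
    if (await checkStatus()) return true
    // Wait a beat before asking again.
    await new Promise(resolve => setTimeout(resolve, intervalMs))
  }
  return false // Give up; let the caller dispatch a failure event.
}
```

Injecting `checkStatus` keeps the loop testable and keeps the backend coupling in the services layer where it belongs.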
services.js
Remember our axiom to compress weirdness? Well, this is where it all gets compressed. If you write your app coupled to your current backend, you invite all sorts of problems. The services file is where the massaging of data and some business logic happen, creating as clean an interface as possible for your app.
import axios from 'axios'
export async function getUserPreferences(userId) {
  const endpoint = someEndpoint
  const body = { userId }
  const requestConfig = {
    headers: {
      'Content-Type': 'application/json'
    }
  }
  const res = await axios.post(endpoint, body, requestConfig)
  return res.data
}
There are a number of things that this setup provides:
- A layer of abstraction over your backend API.
- An event based API with error handling.
- If your backend is event sourced, events can be mapped across the entire stack.
- Events have meaning and potentially context associated with them. A catch-all reducer or saga can easily dump these events to an analytics platform.
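As a sketch of that last point, the catch-all can be a plain Redux middleware (the `track` callback is a hypothetical stand-in for your analytics client):

```javascript
// Hedged sketch: a catch-all Redux middleware that reports every event.
// `track` is a hypothetical analytics callback, e.g. (type, payload) => void.
const analyticsMiddleware = track => store => next => action => {
  track(action.type, action.payload)
  return next(action) // Always forward the action unchanged.
}
```

Because every user event flows through the store, this one middleware covers the whole app, and the event names carry their domain meaning straight into the analytics platform.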
Pages
- Are used to display data and drop the user into workflows.
- Are inherently navigational and informational.
pages
└───SomePage
│ │ index.js
│ │ reducer.js
│ │ selectors.js
│ │
│ └───components
│ │ │ ...
│ │
│ └───__tests__
│ │ ...
│
└───SomeOtherPage
│ ...
In CQRS you're fundamentally separating read and write data flows. Pages are more for reading data, workflows are for writing data. Pages have their own set of data held in state needed for display. The reducer and selector pair handles this.
index.js
The index is usually a large component orchestrating a number of 'dumb' components pulled from the components directory.
import React, { PureComponent } from 'react'
import PropTypes from 'prop-types'
import { connect } from 'react-redux'
import { withStyles } from '@material-ui/core/styles'
import { userPreferencesRequisition } from '../../operations/User/actions'
import { selectPreferences, selectUser } from './selectors'
import SomePreferenceComponent from './components/SomePreferenceComponent'
export const styles = theme => ({
  page: {
    padding: theme.spacing.unit,
    // ...
  }
})
export class SomePage extends PureComponent {
  componentDidMount() {
    const { user, loadPreferences } = this.props
    loadPreferences(user)
  }
  render() {
    const { classes, preferences } = this.props
    return (
      <div className={classes.page}>
        {(preferences || []).map((preference, index) =>
          <SomePreferenceComponent key={index} preference={preference}/>
        )}
      </div>
    )
  }
}
export function mapStateToProps(state) {
  return {
    user: selectUser(state),
    preferences: selectPreferences(state),
  }
}
export function mapDispatchToProps(dispatch) {
  return {
    loadPreferences: user => {
      dispatch(userPreferencesRequisition(user.id))
    }
  }
}
SomePage.propTypes = {
  classes: PropTypes.object.isRequired,
  loadPreferences: PropTypes.func.isRequired,
  user: PropTypes.object.isRequired,
  preferences: PropTypes.array
}
export default connect(
  mapStateToProps,
  mapDispatchToProps
)(withStyles(styles)(SomePage))
reducer.js
The reducer catches different events to populate state for the page.
import {
  USER_PREFERENCES_REQUISITION
} from '../../operations/User/constants'
const initialState = {
  preferences: [],
  loading: false,
  error: '',
}
export default function (state = initialState, action) {
  switch (action.type) {
    case USER_PREFERENCES_REQUISITION.ACTION:
      return {
        ...state,
        loading: true
      }
    case USER_PREFERENCES_REQUISITION.SUCCESS:
      return {
        ...state,
        preferences: action.payload.preferences,
        loading: false
      }
    case USER_PREFERENCES_REQUISITION.FAILURE:
      return {
        ...state,
        error: action.payload.message,
        loading: false
      }
    default:
      return state
  }
}
selectors.js
The selectors namespace the data for the page's reducer, so different slices of the redux store don't step on each other.
export const selectPreferences = state => state.somePage.preferences
export const selectUser = state => state.somePage.user
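That `state.somePage` namespace comes from how the page reducer is mounted into the root reducer. A minimal hand-rolled equivalent of redux's `combineReducers` shows where it comes from (the slice reducer here is a toy stand-in):

```javascript
// Hand-rolled version of redux's combineReducers, to show where the
// `state.somePage` namespace used by the selectors comes from.
function combineReducers(reducers) {
  return (state = {}, action) => {
    const next = {}
    for (const key of Object.keys(reducers)) {
      // Each slice reducer only ever sees (and produces) its own slice.
      next[key] = reducers[key](state[key], action)
    }
    return next
  }
}
// Toy slice reducer standing in for the real page reducer.
const somePage = (state = { preferences: [] }, action) =>
  action.type === 'SET_PREFERENCES'
    ? { ...state, preferences: action.payload }
    : state
const rootReducer = combineReducers({ somePage })
```

In practice you'd use redux's own `combineReducers`; the point is that the key you mount the page reducer under is exactly the key the selectors read.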
Workflows
- Are used to add, change, or interact with data.
- Should be able to open with a single event.
- Should not break without data.
workflows
└───SomeWorkflow
│ │ index.js
│ │ actions.js
│ │ constants.js
│ │ reducer.js
│ │ selectors.js
│ │
│ └───components
│ │ ...
│
└───SomeOtherWorkflow
│ ...
Workflows are basically pages with a bit more functionality. Pages are static; workflows are less so. We've defined events so that every workflow can be opened with an event containing some data.
Keeping workflows small and independent allows the creation of very complex workflows from simpler ones. For example, say you have some form that the user must fill out, which gets sent to the backend as a create for some object. If that object can also be edited, the same form can be reused, and so can the same flow: instead of thinking about a create form and an edit form, you 'create' the object as empty and hand it off to the edit form for the user to enter their data.
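A sketch of that create-as-edit idea (the action type and the empty-object shape are hypothetical): the create entry point just builds an empty object and opens the same edit workflow with it.

```javascript
// Hedged sketch: one edit workflow serves both create and edit.
// The action type and object shape are hypothetical.
const emptyPreferenceSet = () => ({ id: null, preferences: [] })
const openEditWorkflow = item => ({
  type: 'EDIT_WORKFLOW_OPEN',
  payload: { item }
})
// 'Creating' is just editing a freshly made empty object.
const openCreateWorkflow = () => openEditWorkflow(emptyPreferenceSet())
```

Both entry points satisfy the "open with a single event" rule, and the edit workflow never has to know which path the user came from.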
Workflows can flow into each other to create complex flows, but overdoing this can be painful. If you have to build a bunch of conditional logic to figure out what to render, you're probably better off leaving them separate. Trying to force a square peg into a round hole doesn't work so well.
Shared
- Components generally get made for a specific page or workflow; if a component is ever needed by something outside of its folder, that's when it gets moved out and into shared.
- Components used by multiple pages or workflows.
Utils
- Functions and helpers used in multiple components or operations.
- This is basically the same rules as shared, but for functions instead of components.
As the front end gets more and more complex to deliver more value to users, it requires more planning and architecture. Making user events the basic building blocks of your application is the least volatile option in this crazy world of tech. Requirements and technologies will change; that's a fact of life. Building applications to last amid the chaos is what drives true long term user value. With clean architecture, a little bit of experience, and the right building blocks, any developer can build software that lasts.
In my next post I hope to tackle some of the caveats of backend services built in a CQRS or event sourced model. Who knows when that will be available? Not me, that's for sure.