As for the first question: "correct" in what sense? It depends on the problem you are modeling, so more details would need to be provided before that can be answered.
softmax is not used as the activation function when the last layer has only one output unit. That's because softmax normalizes the output so that the sum of its elements is one, i.e. so that it resembles a probability distribution. Therefore, if you used it on a layer with only one output unit, that unit would always output 1. Instead, either a linear activation (in case of regression, i.e. predicting real values) or a sigmoid (in case of binary classification) is used. Additionally, a Dense layer is commonly used as the last layer, acting as the final regressor or classifier. For example:
from keras.models import Sequential
from keras.layers import LSTM, Dense, LeakyReLU

model = Sequential()
# input: sequences of 100 timesteps, each with 4 features
model.add(LSTM(8, batch_input_shape=(None, 100, 4), return_sequences=True))
model.add(LeakyReLU())
model.add(LSTM(4, return_sequences=True))
model.add(LeakyReLU())
model.add(LSTM(1, return_sequences=False))
model.add(Dense(1, activation='sigmoid'))
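To illustrate the point above about softmax: applied to a one-element vector it always returns 1, whatever the input value, which is why it is not useful on a single output unit. A tiny demonstration (my own illustration, using a hand-written softmax):

import numpy as np

def softmax(x):
    # numerically stable softmax
    e = np.exp(x - np.max(x))
    return e / e.sum()

print(softmax(np.array([3.7])))    # [1.]
print(softmax(np.array([-42.0])))  # [1.]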
As for the layers and number of units (according to the figure): it is a bit ambiguous, but I think there are three LSTM layers; the first one has 4 units, the second one has 8 units and the last one has 4 units. The final layer seems to be a Dense layer. So the model would look like this (assuming LeakyReLU is applied on the output of the LSTM layers):
model = Sequential()
model.add(LSTM(4, batch_input_shape=(None, 100, 4), return_sequences=True))
model.add(LeakyReLU())
model.add(LSTM(8, return_sequences=True))
model.add(LeakyReLU())
model.add(LSTM(4, return_sequences=False))
model.add(Dense(1, activation='sigmoid'))  # or activation='linear' if it is a regression problem
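As a side note, here is a minimal sketch of how such a model could be compiled and fitted, assuming a binary classification setup; the optimizer, loss and dummy arrays are just assumptions for illustration:

import numpy as np

# dummy data only to illustrate the expected shapes:
# 32 samples, each a sequence of 100 timesteps with 4 features, plus a binary label per sample
X_train = np.random.random((32, 100, 4))
y_train = np.random.randint(0, 2, size=(32,))

model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(X_train, y_train, epochs=2, batch_size=8)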
As for using the LeakyReLU layer: I guess you are right that a linear activation should be used as the activation of its previous layer (as also suggested here, though a Dense layer has been used there). That's because by default the activation of the LSTM layer is the hyperbolic tangent (i.e. tanh), and therefore it squashes the outputs to the range [-1, 1], which I think may not be effective when you apply LeakyReLU on it; however, I am not sure about this, since I am not completely familiar with the practical and recommended usage of leaky ReLU.
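If you do go that way, a minimal sketch of what I have in mind (my own illustration, not something taken from the figure) would be to pass activation='linear' to the LSTM layers so that LeakyReLU receives the unsquashed outputs:

from keras.models import Sequential
from keras.layers import LSTM, Dense, LeakyReLU

model = Sequential()
# activation='linear' disables the default tanh, so LeakyReLU sees the raw (unsquashed) outputs
model.add(LSTM(4, batch_input_shape=(None, 100, 4), activation='linear', return_sequences=True))
model.add(LeakyReLU())
model.add(LSTM(8, activation='linear', return_sequences=True))
model.add(LeakyReLU())
model.add(LSTM(4, activation='linear', return_sequences=False))
model.add(LeakyReLU())
model.add(Dense(1, activation='sigmoid'))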