
Junissen


Machine language in combat

Describing computer functions and patterns in human terms makes it possible to develop new methods, such as designing a "conductor" that orchestrates the flow of code.
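The decoder fragment below refers to `conv_4_2` and `pool_4`, which come from an encoder stage that is not shown in the article. Here is a minimal sketch of what such a stage might look like; the input shape and filter counts are assumptions chosen to match the 256-filter decoder:

```python
from tensorflow.keras.layers import Input, Conv2D, Activation, MaxPooling2D

# Hypothetical encoder stage; the names conv_4_2 and pool_4 match the
# decoder fragment that follows, but the sizes are illustrative.
inp = Input(shape=(64, 64, 3))
conv_4_1 = Conv2D(256, (3, 3), padding='same')(inp)
conv_4_1 = Activation('relu')(conv_4_1)
conv_4_2 = Conv2D(256, (3, 3), padding='same')(conv_4_1)
conv_4_2 = Activation('relu')(conv_4_2)
pool_4 = MaxPooling2D(2)(conv_4_2)  # halves the spatial size: 64 -> 32
```

The pooled tensor is what the decoder upsamples, while `conv_4_2` is carried over as the skip connection.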

# Decoder stage of a U-Net-style network: upsample, then merge with the
# matching encoder feature map via a skip connection.
up_1 = UpSampling2D(2, interpolation='bilinear')(pool_4)
conc_1 = Concatenate()([conv_4_2, up_1])

conv_up_1_1 = Conv2D(256, (3, 3), padding='same')(conc_1)
conv_up_1_1 = Activation('relu')(conv_up_1_1)

conv_up_1_2 = Conv2D(256, (3, 3), padding='same')(conv_up_1_1)
conv_up_1_2 = Activation('relu')(conv_up_1_2)

Convolutions and concatenation layers form a control block that shapes the neural network. A similar idea is implemented in the open-source stack Kubernetes, which distributes work among services.

# Final 1-channel convolution; sigmoid squashes each pixel into (0, 1).
conv_up_4_2 = Conv2D(1, (3, 3), padding='same')(conv_up_4_1)
result = Activation('sigmoid')(conv_up_4_2)

Connecting to a source server is also a common task for both ML and Kubernetes. Code and open-source software are hard to compare directly, but the management skill involved is much the same.

It is useful for developers to see not only the algorithms and formulas, but also the open technologies that put them into practice.

# 'lr' and 'decay' are deprecated in modern Keras; use 'learning_rate'.
adam = keras.optimizers.Adam(learning_rate=0.0001, beta_1=0.9, beta_2=0.999, epsilon=1e-08)

model.compile(optimizer=adam, loss='binary_crossentropy')

The optimizer and the cross-entropy loss are excellent assistants in steering ML development: together they organize the sequence of actions the neural network model takes.
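To make the loss concrete, binary cross-entropy can be computed by hand. This NumPy sketch uses made-up values for a ground-truth mask and the model's sigmoid outputs; it shows what `'binary_crossentropy'` in `model.compile` measures:

```python
import numpy as np

# Illustrative values, not real model outputs.
y = np.array([1.0, 0.0, 1.0])   # ground-truth mask pixels
p = np.array([0.9, 0.2, 0.6])   # sigmoid outputs of the model

eps = 1e-7  # guard against log(0)
bce = -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
# bce is approximately 0.2798: low where p agrees with y, high otherwise.
```

The closer each prediction is to its label, the smaller the loss; the optimizer then adjusts the weights in the direction that reduces it.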

It is also useful to predict the outcome of the neural network:

pred = model.predict(x)
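Since the model ends in a sigmoid, each predicted pixel lies in (0, 1). A common post-processing step is thresholding the prediction into a binary mask; the array below is a stand-in for a real `model.predict(x)` result:

```python
import numpy as np

# Stand-in for a sigmoid model output, not a real prediction.
pred = np.array([[0.91, 0.12],
                 [0.48, 0.75]])

# Pixels above 0.5 become foreground (1), the rest background (0).
mask = (pred > 0.5).astype(np.uint8)
# mask is [[1, 0], [0, 1]]
```

The 0.5 cutoff is the usual default; in practice it can be tuned on a validation set.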
