Good morning, friends! Hopefully you are all in good health.
This time the Whitecyber Team received an order to build an Artificial Intelligence program in the Python programming language. The program is meant to make decisions on its own, based on a deep-learning process we train it on. The machine-learning software learns from a "Stunting" database provided by the customer. This database contains stunting data, and we were asked to build an Artificial Intelligence that predicts outcomes from it.
Artificial Intelligence (AI)
Artificial intelligence (AI) is the field of science and technology that seeks to create machines or systems that can imitate the human abilities to think, learn, and adapt. AI has a wide range of applications and benefits for human life, in areas such as healthcare, education, industry, and entertainment. However, AI also raises challenges and risks, such as issues of ethics, privacy, security, and socio-economic impact.
AI can be divided into two main kinds: weak AI and strong AI. Weak AI refers to systems designed to perform specific tasks well, but without any awareness or understanding of themselves or their environment; examples include search engines, virtual assistants, and speech recognition. Strong AI refers to systems capable of doing everything a human can do, including having consciousness, emotion, creativity, and morality; examples would be humanoid robots and superintelligence. Strong AI remains a long-term goal that scientists have not yet achieved.
AI works by using algorithms that can process data and information quickly and accurately. One of the fastest-growing branches of AI today is machine learning: the ability of a system to learn from data without being explicitly programmed. Machine learning can be divided into three main kinds: supervised learning, unsupervised learning, and reinforcement learning. It enables systems to recognize patterns, make predictions, optimize decisions, and improve their performance.
AI is one of the most promising and challenging fields of science and technology in the 21st century. It has the potential to bring great benefits to humanity, but that potential must be balanced with ethical and social responsibility. It is therefore important for us to understand what AI is, how it works, what its benefits and risks are, and how to use it wisely and responsibly.
Machine Learning
Machine Learning is the branch of computer science that studies how to build systems that can learn from data and experience. It has applications in areas such as face recognition, product recommendation, sentiment analysis, and disease detection. Machine Learning can be divided into three main categories: supervised learning, unsupervised learning, and reinforcement learning.
Supervised learning is a learning process in which the system is given data already labeled with the desired class or value. The goal is to build a system that can predict the class or value of new, unlabeled data. Examples of supervised-learning methods are linear regression, logistic regression, k-nearest neighbors, decision trees, and neural networks.
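As a minimal sketch of supervised learning (using scikit-learn's bundled iris dataset, which also appears later in this post; all variable names here are illustrative):

```python
# Supervised learning sketch: train a classifier on labeled data,
# then measure how well it predicts labels it has never seen.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)  # features and their known class labels
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)              # learn from the labeled examples
print(clf.score(X_test, y_test))       # accuracy on held-out data
```

The labels are what "supervise" the training; the score on the held-out split estimates how well the model generalizes.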
Unsupervised learning is a learning process in which the system is given unlabeled data. The goal is to discover hidden patterns or structure in the data. Examples of unsupervised-learning methods are clustering, principal component analysis, association rule mining, and autoencoders.
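A minimal clustering sketch (the four 2-D points are invented for illustration):

```python
# Unsupervised learning sketch: k-means groups points with no labels given.
import numpy as np
from sklearn.cluster import KMeans

X = np.array([[1.0, 1.0], [1.2, 0.9],    # two points near (1, 1)
              [8.0, 8.0], [8.1, 7.9]])   # two points near (8, 8)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.labels_)  # the algorithm discovers the two groups on its own
```

No labels are supplied; the structure (two well-separated groups) is found purely from the data.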
Reinforcement learning is a learning process in which the system learns by interacting with an environment and receiving feedback in the form of rewards or punishments. The goal is to build a system that optimizes its behavior, or policy, to achieve a given objective. Examples of reinforcement-learning methods are Q-learning, policy gradients, deep Q-networks, and actor-critic.
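A toy Q-learning sketch (the corridor environment, rewards, and hyperparameters below are all invented for illustration): an agent in a five-state corridor learns by trial and error that walking right leads to the reward.

```python
# Reinforcement learning sketch: tabular Q-learning on a 1-D corridor.
# States 0..4; actions 0 = left, 1 = right; reward 1.0 on reaching state 4.
import numpy as np

n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.5, 0.9, 0.5    # high exploration so the sketch converges fast
rng = np.random.default_rng(0)

for _ in range(500):                      # episodes
    s = 0
    while s != 4:
        # epsilon-greedy action choice: explore sometimes, exploit otherwise
        a = rng.integers(n_actions) if rng.random() < epsilon else int(Q[s].argmax())
        s2 = max(0, s - 1) if a == 0 else min(4, s + 1)
        r = 1.0 if s2 == 4 else 0.0       # reward only at the goal
        # Q-learning update: move Q toward reward plus discounted best future value
        Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
        s = s2

print(Q.argmax(axis=1))  # greedy action per state (1 = right)
```

The feedback loop is the whole point: no labeled examples exist, only delayed rewards, yet the learned greedy policy ends up moving right in every non-terminal state.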
Machine Learning is a fascinating, fast-growing field that can help people solve complex problems and create new innovations. However, it also comes with challenges and ethical concerns that deserve attention, such as data quality, model interpretability, system security, user privacy, and social impact.
Backpropagation
Backpropagation is the algorithm used to train artificial neural networks by adjusting the connection weights between neurons based on the error the network produces. It works by computing the gradient of the cost function with respect to the connection weights and then reducing each weight by the gradient multiplied by the learning rate. In this way the algorithm tries to minimize the cost function by searching for a local or global minimum.
Backpropagation consists of two phases: a forward pass and a backward pass. In the forward pass, the network receives an input and produces an output by computing the neuron activations in each layer in sequence. In the backward pass, the network computes the output error by comparing the produced output with the expected output. This error is then propagated back to the earlier layers by multiplying it by the derivative of the activation function and the connection weights. Finally, the weights in each layer are updated by subtracting the gradient multiplied by the learning rate.
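The two phases can be sketched with NumPy on a tiny one-hidden-layer network (the XOR task, squared-error cost, and all names here are illustrative, not part of the project):

```python
# Backpropagation sketch: forward pass, backward pass, weight update.
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # inputs
t = np.array([[0.], [1.], [1.], [0.]])                  # XOR targets

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
W2 = rng.normal(size=(4, 1))   # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0                        # learning rate
losses = []
for _ in range(5000):
    # forward pass: compute activations layer by layer
    h = sigmoid(X @ W1)
    y = sigmoid(h @ W2)
    losses.append(float(np.mean((y - t) ** 2)))
    # backward pass: output error times activation derivative,
    # then propagated to the hidden layer through the weights
    dy = (y - t) * y * (1 - y)
    dh = (dy @ W2.T) * h * (1 - h)
    # weight update: subtract the gradient scaled by the learning rate
    W2 -= lr * h.T @ dy
    W1 -= lr * X.T @ dh

print(losses[0], losses[-1])  # the cost should shrink as training proceeds
```

Each loop iteration is one full cycle of the algorithm described above: forward pass, error, backward propagation, update.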
Backpropagation has several strengths: it is fast, simple, easy to program, requires no prior knowledge about the network, and is the standard method that generally works well. It also has weaknesses: it is sensitive to the choice of learning rate, can get stuck in local minima or flat valleys, needs a lot of data and training time, and can suffer from overfitting or underfitting.
Project :
- Programming language : Python
- Database : Microsoft Excel
- Environment : Colab notebook
- Resource : https://github.com/whitecyber-faris/project-ai-1
Code :
import pandas as pd
# load the stunting dataset (a CSV export) from Google Drive
df = pd.read_csv("drive/MyDrive/Colab Notebooks/datasetstunting13.csv")
df.info()
.
Result :
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 156 entries, 0 to 155
Data columns (total 14 columns):
# Column Non-Null Count Dtype
--- ------ -------------- -----
0 JenisKelamin 156 non-null int64
1 Umur 156 non-null int64
2 BeratLahir 156 non-null int64
3 TinggiLahir 156 non-null int64
4 AsiEksklusif 156 non-null int64
5 LilaHamil 156 non-null int64
6 TinggiIbu 156 non-null int64
7 RiwayatKehamilanPrematur 156 non-null int64
8 UsiaIbuHamil 156 non-null int64
9 PendidikanIbu 156 non-null int64
10 Pendapatan 156 non-null int64
11 Berat 156 non-null int64
12 Tinggi 156 non-null int64
13 StatusStunting 156 non-null int64
dtypes: int64(14)
memory usage: 17.2 KB
.
Code :
# features: every column except the last; label: the StatusStunting column
x = df.iloc[:, 0:-1].values
y = df.iloc[:, -1].values
print(x.shape)
.
Result :
(156, 13)
.
Code :
import sklearn
from sklearn import datasets
# load dataset iris
iris = datasets.load_iris()
# pisahkan atribut dan label pada dataset
z = iris.data
q = iris.target
from sklearn.model_selection import train_test_split
# membagi datasets menjadi training dan testing
xtrain,xtes,ytrain,ytes=train_test_split(x,y,train_size=0.8,random_state=42)
import tensorflow as tf
model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(units=9,activation='relu', input_shape=(13,)))
model.add(tf.keras.layers.Dense(units=1))
model.compile(loss='mean_absolute_error', optimizer=tf.keras.optimizers.Adam(0.01))
model.fit(xtrain, ytrain, epochs=300, validation_data=(xtes, ytes))
.
Result :
Epoch 1/300
4/4 [==============================] - 0s 26ms/step - loss: 0.1631 - val_loss: 0.1369
Epoch 2/300
4/4 [==============================] - 0s 12ms/step - loss: 0.1649 - val_loss: 0.1563
Epoch 3/300
4/4 [==============================] - 0s 10ms/step - loss: 0.1789 - val_loss: 0.1825
... (epochs 4-258 omitted; loss fluctuates in roughly the 0.15-0.28 range while val_loss gradually settles around 0.12-0.15) ...
Epoch 259/300
4/4 [==============================] - 0s 14ms/step - loss: 0.1556 - val_loss: 0.1276
Epoch 260/300
4/4 [==============================] - 0s 13ms/step - loss: 0.1574 - val_loss: 0.1356
Epoch 261/300
4/4 [==============================] - 0s 16ms/step - loss: 0.1627 - val_loss: 0.1231
Epoch 262/300
4/4 [==============================] - 0s 13ms/step - loss: 0.1582 - val_loss: 0.1855
Epoch 263/300
4/4 [==============================] - 0s 12ms/step - loss: 0.1811 - val_loss: 0.1576
Epoch 264/300
4/4 [==============================] - 0s 13ms/step - loss: 0.1913 - val_loss: 0.1358
Epoch 265/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1682 - val_loss: 0.1289
Epoch 266/300
4/4 [==============================] - 0s 14ms/step - loss: 0.1553 - val_loss: 0.1302
Epoch 267/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1537 - val_loss: 0.1306
Epoch 268/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1545 - val_loss: 0.1276
Epoch 269/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1561 - val_loss: 0.1285
Epoch 270/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1561 - val_loss: 0.1300
Epoch 271/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1585 - val_loss: 0.1435
Epoch 272/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1740 - val_loss: 0.1623
Epoch 273/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1756 - val_loss: 0.1679
Epoch 274/300
4/4 [==============================] - 0s 14ms/step - loss: 0.1766 - val_loss: 0.1314
Epoch 275/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1637 - val_loss: 0.1593
Epoch 276/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1751 - val_loss: 0.1337
Epoch 277/300
4/4 [==============================] - 0s 12ms/step - loss: 0.1602 - val_loss: 0.1323
Epoch 278/300
4/4 [==============================] - 0s 12ms/step - loss: 0.1568 - val_loss: 0.1695
Epoch 279/300
4/4 [==============================] - 0s 12ms/step - loss: 0.1726 - val_loss: 0.1248
Epoch 280/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1556 - val_loss: 0.1260
Epoch 281/300
4/4 [==============================] - 0s 12ms/step - loss: 0.1550 - val_loss: 0.1234
Epoch 282/300
4/4 [==============================] - 0s 12ms/step - loss: 0.1563 - val_loss: 0.1265
Epoch 283/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1677 - val_loss: 0.1839
Epoch 284/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1887 - val_loss: 0.1249
Epoch 285/300
4/4 [==============================] - 0s 12ms/step - loss: 0.2016 - val_loss: 0.1849
Epoch 286/300
4/4 [==============================] - 0s 12ms/step - loss: 0.2000 - val_loss: 0.2305
Epoch 287/300
4/4 [==============================] - 0s 12ms/step - loss: 0.2151 - val_loss: 0.1392
Epoch 288/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1643 - val_loss: 0.1386
Epoch 289/300
4/4 [==============================] - 0s 17ms/step - loss: 0.1707 - val_loss: 0.1262
Epoch 290/300
4/4 [==============================] - 0s 13ms/step - loss: 0.1578 - val_loss: 0.1558
Epoch 291/300
4/4 [==============================] - 0s 12ms/step - loss: 0.1660 - val_loss: 0.1278
Epoch 292/300
4/4 [==============================] - 0s 12ms/step - loss: 0.1670 - val_loss: 0.1667
Epoch 293/300
4/4 [==============================] - 0s 12ms/step - loss: 0.1761 - val_loss: 0.1806
Epoch 294/300
4/4 [==============================] - 0s 11ms/step - loss: 0.2072 - val_loss: 0.1338
Epoch 295/300
4/4 [==============================] - 0s 12ms/step - loss: 0.1814 - val_loss: 0.1530
Epoch 296/300
4/4 [==============================] - 0s 12ms/step - loss: 0.1599 - val_loss: 0.1532
Epoch 297/300
4/4 [==============================] - 0s 19ms/step - loss: 0.1582 - val_loss: 0.1323
Epoch 298/300
4/4 [==============================] - 0s 13ms/step - loss: 0.1707 - val_loss: 0.1596
Epoch 299/300
4/4 [==============================] - 0s 11ms/step - loss: 0.1671 - val_loss: 0.1783
Epoch 300/300
4/4 [==============================] - 0s 13ms/step - loss: 0.1751 - val_loss: 0.1195
<keras.callbacks.History at 0x7fa18c56f790>
.
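One thing the per-epoch log above makes visible is that `val_loss` bounces around rather than decreasing steadily, and the final epoch is not guaranteed to be the best one. A common refinement (not part of the original code) is an `EarlyStopping` callback, which halts training once the validation loss stops improving and can restore the best weights seen. The sketch below is self-contained on synthetic data, since the customer's stunting database is not reproduced here; the layer sizes and `patience` value are illustrative assumptions only.

```python
import numpy as np
from tensorflow import keras

# Synthetic stand-in data (assumption: binary target, 4 numeric features);
# the real project would use the customer's stunting dataset instead.
rng = np.random.default_rng(0)
x = rng.random((100, 4))
y = (x.sum(axis=1) > 2).astype(float)

model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Stop when val_loss has not improved for 20 epochs, and keep the weights
# from the epoch with the lowest val_loss rather than the last epoch.
stop = keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=20, restore_best_weights=True)

history = model.fit(x, y, epochs=300, validation_split=0.2,
                    callbacks=[stop], verbose=0)
print(len(history.history["loss"]))  # typically well under 300 epochs
```

With `restore_best_weights=True`, the subsequent `model.evaluate` call scores the best checkpoint instead of whatever the final epoch happened to produce.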
Code:
model.evaluate(xtes, ytes)
.
Result:
1/1 [==============================] - 0s 31ms/step - loss: 0.1195
0.11945655941963196
.
Code:
hasil = model.predict(xtes)

import matplotlib.pyplot as plt

# Plot predicted and actual values side by side for every test sample
a = range(len(ytes))
plt.scatter(a, hasil, color='blue', label='Predicted')
plt.scatter(a, ytes, color='red', label='Actual')
plt.title("Predicted vs. Actual Stunting Status")
plt.xlabel("Test Sample")
plt.ylabel("Stunting Status")
plt.legend()
plt.show()
.
Result:
(Scatter plot: predicted values in blue against the actual stunting status in red, one point per test sample.)
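Beyond eyeballing the scatter plot, the model's continuous outputs can be turned into hard class labels and scored. The sketch below uses hypothetical arrays standing in for `model.predict(xtes)` and `ytes` (the assumption, suggested by the single sigmoid-style loss value above, is a binary stunting label of 0/1); with the real data, you would pass the actual `hasil` and `ytes` instead.

```python
import numpy as np

# Hypothetical predicted probabilities and true labels, standing in for
# model.predict(xtes) and ytes from the code above.
hasil = np.array([0.10, 0.85, 0.40, 0.92, 0.05])
ytes = np.array([0, 1, 1, 1, 0])

# Threshold the continuous outputs at 0.5 to get hard 0/1 labels,
# then compare with the ground truth to get an accuracy figure.
prediksi = (hasil >= 0.5).astype(int)
akurasi = (prediksi == ytes).mean()
print(akurasi)  # 0.8 here: 4 of the 5 sample predictions match
```

This gives a more interpretable summary than the raw loss value returned by `model.evaluate`, at the cost of fixing a decision threshold.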