Update README.md
README.md (CHANGED)
@@ -12,17 +12,6 @@ tags:

# **Image-Deep-Fake-Detector**

-```
-Classification report:
-
-              precision    recall  f1-score   support
-
-        Real     0.9933    0.9937    0.9935      4761
-        Fake     0.9937    0.9933    0.9935      4760
-
-    accuracy                         0.9935      9521
-   macro avg     0.9935    0.9935    0.9935      9521
-weighted avg     0.9935    0.9935    0.9935      9521
```

The **precision score** is a key metric to evaluate the performance of a deep fake detector. Precision is defined as:

@@ -33,14 +22,7 @@ The **precision score** is a key metric to evaluate the performance of a deep fa

It indicates how well the model avoids false positives, which in the context of a deep fake detector means it measures how often the "Fake" label is correctly identified without mistakenly classifying real content as fake.

-
-- **Real:** 0.9933
-- **Fake:** 0.9937
-- **Macro average:** 0.9935
-- **Weighted average:** 0.9935
-
-### Demo Inference:
+# Demo Inference:


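The demo image above shows single-image inference. A hedged sketch of what that step might look like with the Hugging Face `transformers` image-classification pipeline is below; the model id is a hypothetical placeholder, since the commit does not show the actual inference code:

```python
# Hypothetical inference sketch. The model id below is a placeholder, not
# necessarily this repository's published checkpoint.
from transformers import pipeline

detector = pipeline("image-classification", model="<user>/Image-Deep-Fake-Detector")
scores = detector("./Fake_image.png")  # returns a list of {"label", "score"} dicts
print(scores)
```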