🚀 Launch our application by following the steps below:
```bash
./main.py <COMMAND> <ARGUMENTS>...
```
For `<COMMAND>`, enter one of the commands from the list below, which also gives an overview of the possible `<ARGUMENTS>` (see the example invocations after the table).
ℹ️ The icons indicate whether a command is mandatory: 🔛 marks generally mandatory commands, 🍸 marks commands that are mandatory for *MixUp*, and 🌐 marks commands that are mandatory for *TMix*.
| Command | Functionality | Arguments |
| ------- | ------------- |-----------|
|**`-lrtwo`**/**`--second_learning_rate`**| Separate learning rate for the multi-layer perceptron.|Default is `None`.|
|**`--mlp`**| Whether or not to use a two-layer MLP as the classifier.| |
|🔛 **`-rs`**/**`--random_seed`**|Random seed for initialization of the model.|Default is $42$.|
|**`-sd`**/**`--save_directory`**|Specifies the destination directory for the output results of the run.||
|**`-msp`**/**`--model_save_path`**|Specifies the destination directory for saving the model.|We recommend saving models in [Code/saved_models](Code/saved_models).|
|**`--masking`**|Whether or not to mask the target word.||
|🌐 **`--mixlayer`**| Specifies the `layer` in which the interpolation takes place. Select only one layer at a time. | Choose from $\{0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11\}$ |
|🍸, 🌐 **`-lambda`**/**`--lambda_value`**|Specifies the lambda value for the interpolation in *MixUp* and *TMix*.|Choose any value between $0$ and $1$; default is $0.4$, `type=float`.|
| <center>**MixUp** specific</center> | | |
|🍸 **`-mixup`**/**`--mix_up`**| Whether or not to use *MixUp*. If so, please also specify `-lambda` and `-mixepoch`.| |
|🍸 **`-mixepoch`**/**`--mixepoch`**|Specifies the epoch(s) in which to apply *MixUp*.|Default is `None`|
|🔛 **`-tb`**/**`--test_batch_size`**|Specifies the batch size for the test process.|Default is $16$.|
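To make the table more concrete, here are two illustrative invocations, one for *MixUp* and one for *TMix*. They are only sketches: all values (seed, batch size, lambda, epoch, layer, output directory) are placeholders, not recommended settings, and a real run will likely need additional mandatory (🔛) arguments that are not part of this excerpt.
```bash
# Hypothetical MixUp run (flag values are placeholders, not recommendations):
./main.py -mixup -lambda 0.4 -mixepoch 1 -rs 42 -tb 16 -sd results/

# Hypothetical TMix run (flag values are placeholders, not recommendations):
./main.py --mixlayer 7 -lambda 0.4 -rs 42 -tb 16 -sd results/
```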
📝 If you want to use our *backtranslation* code, you must execute the following:
```bash
python3 Code/backtranslate.py
```
🎥 If you want to see a demo of our model, you can enter your own sentence and let the model predict whether a target word is used in its `literal` or `non-literal` sense:
```bash
python3 inference.py
```
<img src="documentation/images/demo.png" width="80%" height="80%">
***
## 🏯 Code-Structure <a name="code-structure"></a>