path: root/training/dtrain/examples/parallelized/work/out.0.2
Loading the LM will be faster if you build a binary file.
Reading ../standard/nc-wmt11.en.srilm.gz
----5---10---15---20---25---30---35---40---45---50---55---60---65---70---75---80---85---90---95--100
****************************************************************************************************
dtrain
Parameters:
                       k 100
                       N 4
                       T 1
           learning rate 0.0001
            error margin 1
                  l1 reg 0
            decoder conf 'cdec.ini'
                   input 'work/shard.0.0.in'
                  output 'work/weights.0.2'
              weights in 'work/weights.1'
(a dot per input)
Iteration #1 of 1.
 .... 3
WEIGHTS
              Glue = -0.44422
       WordPenalty = +0.1032
     LanguageModel = +0.66474
 LanguageModel_OOV = -0.62252
     PhraseModel_0 = -0.59993
     PhraseModel_1 = +0.78992
     PhraseModel_2 = +1.3149
     PhraseModel_3 = +0.21434
     PhraseModel_4 = -1.0174
     PhraseModel_5 = +0.02435
     PhraseModel_6 = -0.18452
       PassThrough = -0.65268
        ---
       1best avg score: 0.24722 (+0.24722)
 1best avg model score: 61.971
           avg # pairs: 2017.7
   non-0 feature count: 12
           avg list sz: 100
           avg f count: 10.42
(time 0.3 min, 6 s/S)

---
Best iteration: 1 [GOLD = 0.24722].
This took 0.3 min.