laristra / flecsi

A mirror of FleCSI's internal gitlab repository.

feature/replication fails ghost_access test

jpietarilagraham opened this issue · comments

The ghosts are empty!

[jgraham@cn198 execution]$ !mpi
mpirun -n 2 ./ghost_access
WARNING: Using GASNet's mpi-conduit, which exists for portability convenience.
WARNING: This system appears to contain recognized network hardware: InfiniBand IBV
WARNING: which is supported by a GASNet native conduit, although
WARNING: it was not detected at configure time (missing drivers?)
WARNING: You should really use the high-performance native GASNet conduit
WARNING: if communication performance is at all important in this program run.
MPI Rank 0 maps to Legion Address Space 0
MPI Rank 1 maps to Legion Address Space 1
MPI Rank 0 maps to Legion Address Space 0
MPI Rank 1 maps to Legion Address Space 1
In specialization top-level-task init
In specialization top-level-task init
CREATE primary 0,0-152
CREATE primary 1,0-135
CREATE primary 0,0-152
CREATE primary 1,0-135
CREATE primary 0,0-127
CREATE primary 1,0-127
CREATE primary 0,0-127
CREATE primary 1,0-127
[0 - 14e0b09ce700] {3}{runtime}: flecsi_ispace 0 primary_lp 0 0,0 - 0,127
[0 - 14e0b09ce700] {3}{runtime}: flecsi_ispace 0 primary_lp 1 1,0 - 1,127
[1 - 146fb5f5d700] {3}{runtime}: flecsi_ispace 0 primary_lp 0 0,0 - 0,127
[1 - 146fb5f5d700] {3}{runtime}: flecsi_ispace 0 primary_lp 1 1,0 - 1,127
[0 - 14e0b09ce700] {3}{runtime}: flecsi_ispace 1 primary_lp 0 0,0 - 0,152
[0 - 14e0b09ce700] {3}{runtime}: flecsi_ispace 1 primary_lp 1 1,0 - 1,135
[1 - 146fb5f5d700] {3}{runtime}: flecsi_ispace 1 primary_lp 0 0,0 - 0,152
[1 - 146fb5f5d700] {3}{runtime}: flecsi_ispace 1 primary_lp 1 1,0 - 1,135
in driver
in driver
Rank 1 WRITING
Rank 0 WRITING
Rank 0 READING
Rank 0 exclusive 0 = 9
Rank 0 exclusive 1 = 10
Rank 0 exclusive 2 = 11
Rank 0 exclusive 3 = 12
Rank 0 exclusive 4 = 13
Rank 0 exclusive 5 = 14
Rank 0 exclusive 6 = 15
Rank 0 exclusive 7 = 25
Rank 0 exclusive 8 = 26
Rank 0 exclusive 9 = 27
Rank 0 exclusive 10 = 28
Rank 0 exclusive 11 = 29
Rank 0 exclusive 12 = 30
Rank 0 exclusive 13 = 31
Rank 0 exclusive 14 = 41
Rank 0 exclusive 15 = 42
Rank 0 exclusive 16 = 43
Rank 0 exclusive 17 = 44
Rank 0 exclusive 18 = 45
Rank 0 exclusive 19 = 46
Rank 0 exclusive 20 = 47
Rank 0 exclusive 21 = 57
Rank 0 exclusive 22 = 58
Rank 0 exclusive 23 = 59
Rank 0 exclusive 24 = 60
Rank 0 exclusive 25 = 61
Rank 0 exclusive 26 = 62
Rank 0 exclusive 27 = 63
Rank 0 exclusive 28 = 73
Rank 0 exclusive 29 = 74
Rank 0 exclusive 30 = 75
Rank 0 exclusive 31 = 76
Rank 0 exclusive 32 = 77
Rank 0 exclusive 33 = 78
Rank 0 exclusive 34 = 79
Rank 0 exclusive 35 = 89
Rank 0 exclusive 36 = 90
Rank 0 exclusive 37 = 91
Rank 0 exclusive 38 = 92
Rank 0 exclusive 39 = 93
Rank 0 exclusive 40 = 94
Rank 0 exclusive 41 = 95
Rank 0 exclusive 42 = 105
Rank 0 exclusive 43 = 106
Rank 0 exclusive 44 = 107
Rank 0 exclusive 45 = 108
Rank 0 exclusive 46 = 109
Rank 0 exclusive 47 = 110
Rank 0 exclusive 48 = 111
Rank 0 exclusive 49 = 121
Rank 0 exclusive 50 = 122
Rank 0 exclusive 51 = 123
Rank 0 exclusive 52 = 124
Rank 0 exclusive 53 = 125
Rank 0 exclusive 54 = 126
Rank 0 exclusive 55 = 127
Rank 0 exclusive 56 = 137
Rank 0 exclusive 57 = 138
Rank 0 exclusive 58 = 139
Rank 0 exclusive 59 = 140
Rank 0 exclusive 60 = 141
Rank 0 exclusive 61 = 142
Rank 0 exclusive 62 = 143
Rank 0 exclusive 63 = 153
Rank 0 exclusive 64 = 154
Rank 0 exclusive 65 = 155
Rank 0 exclusive 66 = 156
Rank 0 exclusive 67 = 157
Rank 0 exclusive 68 = 158
Rank 0 exclusive 69 = 159
Rank 0 exclusive 70 = 169
Rank 0 exclusive 71 = 170
Rank 0 exclusive 72 = 171
Rank 0 exclusive 73 = 172
Rank 0 exclusive 74 = 173
Rank 0 exclusive 75 = 174
Rank 0 exclusive 76 = 175
Rank 0 exclusive 77 = 185
Rank 0 exclusive 78 = 186
Rank 0 exclusive 79 = 187
Rank 0 exclusive 80 = 188
Rank 0 exclusive 81 = 189
Rank 0 exclusive 82 = 190
Rank 0 exclusive 83 = 191
Rank 0 exclusive 84 = 201
Rank 0 exclusive 85 = 202
Rank 0 exclusive 86 = 203
Rank 0 exclusive 87 = 204
Rank 0 exclusive 88 = 205
Rank 0 exclusive 89 = 206
Rank 0 exclusive 90 = 207
Rank 0 exclusive 91 = 217
Rank 0 exclusive 92 = 218
Rank 0 exclusive 93 = 219
Rank 0 exclusive 94 = 220
Rank 0 exclusive 95 = 221
Rank 0 exclusive 96 = 222
Rank 0 exclusive 97 = 223
Rank 0 exclusive 98 = 233
Rank 0 exclusive 99 = 234
Rank 0 exclusive 100 = 235
Rank 0 exclusive 101 = 236
Rank 0 exclusive 102 = 237
Rank 0 exclusive 103 = 238
Rank 0 exclusive 104 = 239
Rank 0 exclusive 105 = 249
Rank 0 exclusive 106 = 250
Rank 0 exclusive 107 = 251
Rank 0 exclusive 108 = 252
Rank 0 exclusive 109 = 253
Rank 0 exclusive 110 = 254
Rank 0 exclusive 111 = 255
Rank 0 shared 0 = 8
Rank 0 shared 1 = 24
Rank 0 shared 2 = 40
Rank 0 shared 3 = 56
Rank 0 shared 4 = 72
Rank 0 shared 5 = 88
Rank 0 shared 6 = 104
Rank 0 shared 7 = 120
Rank 0 shared 8 = 136
Rank 0 shared 9 = 152
Rank 0 shared 10 = 168
Rank 0 shared 11 = 184
Rank 0 shared 12 = 200
Rank 0 shared 13 = 216
Rank 0 shared 14 = 232
Rank 0 shared 15 = 248
Rank 0 ghost 0 = 0
Rank 0 ghost 1 = 0
Rank 0 ghost 2 = 0
Rank 0 ghost 3 = 0
Rank 0 ghost 4 = 0
Rank 0 ghost 5 = 0
Rank 0 ghost 6 = 0
Rank 0 ghost 7 = 0
Rank 0 ghost 8 = 0
Rank 0 ghost 9 = 0
Rank 0 ghost 10 = 0
Rank 0 ghost 11 = 0
Rank 0 ghost 12 = 0
Rank 0 ghost 13 = 0
Rank 0 ghost 14 = 0
Rank 0 ghost 15 = 0
/home/jgraham/github/flecsi/flecsi/execution/test/ghost_access_drivers.cc:197: Failure
Expected: cell_ID.ghost(index)
Which is: 0
To be equal to: ghost.id + cycle
Which is: 7
Rank 1 READING
Rank 1 exclusive 0 = 0
Rank 1 exclusive 1 = 1
Rank 1 exclusive 2 = 2
Rank 1 exclusive 3 = 3
Rank 1 exclusive 4 = 4
Rank 1 exclusive 5 = 5
Rank 1 exclusive 6 = 6
Rank 1 exclusive 7 = 16
Rank 1 exclusive 8 = 17
Rank 1 exclusive 9 = 18
Rank 1 exclusive 10 = 19
Rank 1 exclusive 11 = 20
Rank 1 exclusive 12 = 21
Rank 1 exclusive 13 = 22
Rank 1 exclusive 14 = 32
Rank 1 exclusive 15 = 33
Rank 1 exclusive 16 = 34
Rank 1 exclusive 17 = 35
Rank 1 exclusive 18 = 36
Rank 1 exclusive 19 = 37
Rank 1 exclusive 20 = 38
Rank 1 exclusive 21 = 48
Rank 1 exclusive 22 = 49
Rank 1 exclusive 23 = 50
Rank 1 exclusive 24 = 51
Rank 1 exclusive 25 = 52
Rank 1 exclusive 26 = 53
Rank 1 exclusive 27 = 54
Rank 1 exclusive 28 = 64
Rank 1 exclusive 29 = 65
Rank 1 exclusive 30 = 66
Rank 1 exclusive 31 = 67
Rank 1 exclusive 32 = 68
Rank 1 exclusive 33 = 69
Rank 1 exclusive 34 = 70
Rank 1 exclusive 35 = 80
Rank 1 exclusive 36 = 81
Rank 1 exclusive 37 = 82
Rank 1 exclusive 38 = 83
Rank 1 exclusive 39 = 84
Rank 1 exclusive 40 = 85
Rank 1 exclusive 41 = 86
Rank 1 exclusive 42 = 96
Rank 1 exclusive 43 = 97
Rank 1 exclusive 44 = 98
Rank 1 exclusive 45 = 99
Rank 1 exclusive 46 = 100
Rank 1 exclusive 47 = 101
Rank 1 exclusive 48 = 102
Rank 1 exclusive 49 = 112
Rank 1 exclusive 50 = 113
Rank 1 exclusive 51 = 114
Rank 1 exclusive 52 = 115
Rank 1 exclusive 53 = 116
Rank 1 exclusive 54 = 117
Rank 1 exclusive 55 = 118
Rank 1 exclusive 56 = 128
Rank 1 exclusive 57 = 129
Rank 1 exclusive 58 = 130
Rank 1 exclusive 59 = 131
Rank 1 exclusive 60 = 132
Rank 1 exclusive 61 = 133
Rank 1 exclusive 62 = 134
Rank 1 exclusive 63 = 144
Rank 1 exclusive 64 = 145
Rank 1 exclusive 65 = 146
Rank 1 exclusive 66 = 147
Rank 1 exclusive 67 = 148
Rank 1 exclusive 68 = 149
Rank 1 exclusive 69 = 150
Rank 1 exclusive 70 = 160
Rank 1 exclusive 71 = 161
Rank 1 exclusive 72 = 162
Rank 1 exclusive 73 = 163
Rank 1 exclusive 74 = 164
Rank 1 exclusive 75 = 165
Rank 1 exclusive 76 = 166
Rank 1 exclusive 77 = 176
Rank 1 exclusive 78 = 177
Rank 1 exclusive 79 = 178
Rank 1 exclusive 80 = 179
Rank 1 exclusive 81 = 180
Rank 1 exclusive 82 = 181
Rank 1 exclusive 83 = 182
Rank 1 exclusive 84 = 192
Rank 1 exclusive 85 = 193
Rank 1 exclusive 86 = 194
Rank 1 exclusive 87 = 195
Rank 1 exclusive 88 = 196
Rank 1 exclusive 89 = 197
Rank 1 exclusive 90 = 198
Rank 1 exclusive 91 = 208
Rank 1 exclusive 92 = 209
Rank 1 exclusive 93 = 210
Rank 1 exclusive 94 = 211
Rank 1 exclusive 95 = 212
Rank 1 exclusive 96 = 213
Rank 1 exclusive 97 = 214
Rank 1 exclusive 98 = 224
Rank 1 exclusive 99 = 225
Rank 1 exclusive 100 = 226
Rank 1 exclusive 101 = 227
Rank 1 exclusive 102 = 228
Rank 1 exclusive 103 = 229
Rank 1 exclusive 104 = 230
Rank 1 exclusive 105 = 240
Rank 1 exclusive 106 = 241
Rank 1 exclusive 107 = 242
Rank 1 exclusive 108 = 243
Rank 1 exclusive 109 = 244
Rank 1 exclusive 110 = 245
Rank 1 exclusive 111 = 246
Rank 1 shared 0 = 7
Rank 1 shared 1 = 23
Rank 1 shared 2 = 39
Rank 1 shared 3 = 55
Rank 1 shared 4 = 71
Rank 1 shared 5 = 87
Rank 1 shared 6 = 103
Rank 1 shared 7 = 119
Rank 1 shared 8 = 135
Rank 1 shared 9 = 151
Rank 1 shared 10 = 167
Rank 1 shared 11 = 183
Rank 1 shared 12 = 199
Rank 1 shared 13 = 215
Rank 1 shared 14 = 231
Rank 1 shared 15 = 247
Rank 1 ghost 0 = 0
Rank 1 ghost 1 = 0
Rank 1 ghost 2 = 0
Rank 1 ghost 3 = 0
Rank 1 ghost 4 = 0
Rank 1 ghost 5 = 0
Rank 1 ghost 6 = 0
Rank 1 ghost 7 = 0
Rank 1 ghost 8 = 0
Rank 1 ghost 9 = 0
Rank 1 ghost 10 = 0
Rank 1 ghost 11 = 0
Rank 1 ghost 12 = 0
Rank 1 ghost 13 = 0
Rank 1 ghost 14 = 0
Rank 1 ghost 15 = 0
/home/jgraham/github/flecsi/flecsi/execution/test/ghost_access_drivers.cc:197: Failure
Expected: cell_ID.ghost(index)
Which is: 0
To be equal to: ghost.id + cycle
Which is: 8
[==========] [==========] Running 1 test from 1 test case.
[----------] Global test environment set-up.
Running 1 test from 1 test case.
[----------] Global test environment set-up.
[----------] Global test environment tear-down
[----------] Global test environment tear-down
[==========] [==========] 1 test from 1 test case ran. (0 ms total)
[ PASSED ] 1 test from 1 test case ran. (0 ms total)
[ PASSED ] 1 test.
1 test.
[ FAILED ] 0 tests, listed below:

0 FAILED TESTS
[ FAILED ] 0 tests, listed below:

0 FAILED TESTS
[jgraham@cn198 execution]$

Weird, I tried the replication-dependent-partition branch, and it works correctly.
Rank 1 ghost 0 = 8
Rank 1 ghost 1 = 24
Rank 1 ghost 2 = 40
Rank 1 ghost 3 = 56
Rank 1 ghost 4 = 72
Rank 1 ghost 5 = 88
Rank 1 ghost 6 = 104
Rank 1 ghost 7 = 120
Rank 1 ghost 8 = 136
Rank 1 ghost 9 = 152
Rank 1 ghost 10 = 168
Rank 1 ghost 11 = 184
Rank 1 ghost 12 = 200
Rank 1 ghost 13 = 216
Rank 1 ghost 14 = 232
Rank 1 ghost 15 = 248
Rank 0 ghost 0 = 7
Rank 0 ghost 1 = 23
Rank 0 ghost 2 = 39
Rank 0 ghost 3 = 55
Rank 0 ghost 4 = 71
Rank 0 ghost 5 = 87
Rank 0 ghost 6 = 103
Rank 0 ghost 7 = 119
Rank 0 ghost 8 = 135
Rank 0 ghost 9 = 151
Rank 0 ghost 10 = 167
Rank 0 ghost 11 = 183
Rank 0 ghost 12 = 199
Rank 0 ghost 13 = 215
Rank 0 ghost 14 = 231
Rank 0 ghost 15 = 247

I also tried the replication branch, and it works correctly:
Rank 0 ghost 0 = 7
Rank 0 ghost 1 = 23
Rank 0 ghost 2 = 39
Rank 0 ghost 3 = 55
Rank 0 ghost 4 = 71
Rank 0 ghost 5 = 87
Rank 0 ghost 6 = 103
Rank 0 ghost 7 = 119
Rank 0 ghost 8 = 135
Rank 0 ghost 9 = 151
Rank 0 ghost 10 = 167
Rank 0 ghost 11 = 183
Rank 0 ghost 12 = 199
Rank 0 ghost 13 = 215
Rank 0 ghost 14 = 231
Rank 0 ghost 15 = 247
Rank 1 ghost 0 = 8
Rank 1 ghost 1 = 24
Rank 1 ghost 2 = 40
Rank 1 ghost 3 = 56
Rank 1 ghost 4 = 72
Rank 1 ghost 5 = 88
Rank 1 ghost 6 = 104
Rank 1 ghost 7 = 120
Rank 1 ghost 8 = 136
Rank 1 ghost 9 = 152
Rank 1 ghost 10 = 168
Rank 1 ghost 11 = 184
Rank 1 ghost 12 = 200
Rank 1 ghost 13 = 216
Rank 1 ghost 14 = 232
Rank 1 ghost 15 = 248
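For reference, the ID pattern in these listings is consistent with a 16x16 cell grid (cell ID = row * 16 + col) split into left/right halves between the two ranks, with each rank's ghost cells being the other rank's shared boundary column. This is my inference from the printed values, not FleCSI code; a minimal Python sketch of the expected halo values, under that assumption:

```python
# Hypothetical sketch (not FleCSI code): reconstruct the expected ghost
# values for what appears to be a 16x16 grid, cell ID = row * N + col,
# split between two ranks at the column 7/8 boundary.
N = 16  # assumed grid width, inferred from the printed IDs

# Rank 1 appears to own columns 0..7, rank 0 columns 8..15.
# Each rank's shared cells sit on the split boundary and become the
# other rank's ghost cells.
rank1_shared = [row * N + 7 for row in range(N)]   # column 7
rank0_shared = [row * N + 8 for row in range(N)]   # column 8

rank0_ghost_expected = rank1_shared  # what rank 0's ghosts should read
rank1_ghost_expected = rank0_shared  # what rank 1's ghosts should read

print(rank0_ghost_expected[:3], rank0_ghost_expected[-1])  # [7, 23, 39] 247
print(rank1_ghost_expected[:3], rank1_ghost_expected[-1])  # [8, 24, 40] 248
```

These match the correct output above from the working branches (7, 23, ..., 247 and 8, 24, ..., 248), whereas the failing feature/replication run reads all zeros.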

I think there are some -D options I need to pass when I run cmake?

I'm building flecsi with cmake .. -DENABLE_UNIT_TESTS=ON -DFLECSI_RUNTIME_MODEL=legion -DENABLE_COLORING=ON -DMAPPER_COMPACTION=OFF

flecsi is aac9bc5

flecsi-3rd-party is 9fd51b41565a0be27b3723b874c406b335f56220

legion is ba9042b0e5fd5ec2f283ad82b0e089d43d03a259

Can you give me more details on your build? I assume this is on darwin with:

  1. gcc/8.1.0
  2. mpich/3.2.1-gcc_8.1.0
  3. cmake/3.11.1
  4. boost/1.67.0_gcc-8.1

With Legion built in Debug mode, and the options you provided:
cmake ../ -DFLECSI_RUNTIME_MODEL=legion -DENABLE_UNIT_TESTS=on -DENABLE_MPI=ON -DENABLE_PARMETIS=ON -DENABLE_COLORING=ON -DENABLE_MAPPER_COMPACTION=OFF -DCOMPACTED_STORAGE_SORT=OFF
ghost_access works

But I find that dense_data fails!

(gdb) bt
#0 0x00001460180b656d in nanosleep () from /lib64/libc.so.6
#1 0x00001460180b6404 in sleep () from /lib64/libc.so.6
#2 0x0000146019cf49a8 in Realm::realm_freeze (signal=6)
at /home/jgraham/github/flecsi-third-party/legion/runtime/realm/runtime_impl.cc:131
#3 <signal handler called>
#4 0x0000146018027277 in raise () from /lib64/libc.so.6
#5 0x0000146018028968 in abort () from /lib64/libc.so.6
#6 0x0000146018020096 in assert_fail_base () from /lib64/libc.so.6
#7 0x0000146018020142 in assert_fail () from /lib64/libc.so.6
#8 0x000014601a9317c4 in Legion::Internal::Runtime::report_error_message (
id=id@entry=303,
file_name=file_name@entry=0x14601a9fa2c0 "/home/jgraham/github/flecsi-third-party/legion/runtime/legion/legion_tasks.cc", line=line@entry=1657,
message=message@entry=0x145fac552e00 "Projection region requirement 0 used in non-index space task flecsi::execution::init")
at /home/jgraham/github/flecsi-third-party/legion/runtime/legion/runtime.cc:21857
#9 0x000014601a6c40aa in Legion::Internal::TaskOp::perform_privilege_checks (
this=this@entry=0x145f980a56e0)
at /home/jgraham/github/flecsi-third-party/legion/runtime/legion/legion_tasks.cc:1654
#10 0x000014601a6db3d4 in Legion::Internal::IndividualTask::initialize_task (
this=this@entry=0x145f980a56e0, ctx=ctx@entry=0x145fa40024d0,
launcher=..., check_privileges=<optimized out>, track=track@entry=true)
at /home/jgraham/github/flecsi-third-party/legion/runtime/legion/legion_tasks.cc:5159
#11 0x000014601a5e93bb in Legion::Internal::ReplicateContext::execute_task (
this=0x145fa40024d0, launcher=...)
at /home/jgraham/github/flecsi-third-party/legion/runtime/legion/legion_context.cc:9845
#12 0x000014601a936069 in Legion::Internal::Runtime::execute_task (
this=<optimized out>, ctx=0x145fa40024d0, launcher=...)
at /home/jgraham/github/flecsi-third-party/legion/runtime/legion/runtime.cc:12521
#13 0x000014601a60aeba in Legion::Runtime::execute_task (this=<optimized out>,
ctx=<optimized out>, launcher=...)
at /home/jgraham/github/flecsi-third-party/legion/runtime/legion/legion.cc:5870
#14 0x0000000000595e15 in flecsi::execution::legion_execution_policy_t::execute_task
<(flecsi::execution::launch_type_t)0, 0, 6136001009106431597ul, void, std::tuple<flecsi::data_client_handle_base
_<flecsi::supplemental::test_mesh_2d_t, 1ul, flecsi::legion_data_client_handle_policy_t>, flecsi::accessor__<(flecsi::data::storage_label_type_t)1, unsigned long, 3ul, 3ul, 1ul> >, flecsi::data_client_handle_base__<flecsi::supplemental::test_mesh_2d_t, 0ul, flecsi::legion_data_client_handle_policy_t>&, flecsi::data::legion::dense_handle_t<unsigned long, 0ul, 0ul, 0ul>&>::execute (task_args=std::tuple containing = {...})
at /home/jgraham/github/flecsi/flecsi/execution/legion/execution_policy.h:465
#15 0x00000000005960cf in flecsi::execution::legion_execution_policy_t::execute_task<(flecsi::execution::launch_type_t)0, 0, 6136001009106431597ul, void, std::tuple<flecsi::data_client_handle_base__<flecsi::supplemental::test_mesh_2d_t, 1ul, flecsi::legion_data_client_handle_policy_t>, flecsi::accessor__<(flecsi::data::storage_label_type_t)1, unsigned long, 3ul, 3ul, 1ul> >, flecsi::data_client_handle_base__<flecsi::supplemental::test_mesh_2d_t, 0ul, flecsi::legion_data_client_handle_policy_t>&, flecsi::data::legion::dense_handle_t<unsigned long, 0ul, 0ul, 0ul>&> (args#0=warning: RTTI symbol not found for class 'flecsi::data_client_handle_base__<flecsi::supplemental::test_mesh_2d_t, 0ul, flecsi::legion_data_client_handle_policy_t>'
..., args#1=...)
at /home/jgraham/github/flecsi/flecsi/execution/legion/execution_policy.h:488
#16 0x0000000000596193 in flecsi::execution::task_interface__<flecsi::execution::legion_execution_policy_t>::execute_task<(flecsi::execution::launch_type_t)0, 0, 6136001009106431597ul, void, std::tuple<flecsi::data_client_handle_base__<flecsi::supplemental::test_mesh_2d_t, 1ul, flecsi::legion_data_client_handle_policy_t>, flecsi::accessor__<(flecsi::data::storage_label_type_t)1, unsigned long, 3ul, 3ul, 1ul> >, flecsi::data_client_handle_base__<flecsi::supplemental::test_mesh_2d_t, 0ul, flecsi::legion_data_client_handle_policy_t>&, flecsi::data::legion::dense_handle_t<unsigned long, 0ul, 0ul, 0ul>&> (args#0=warning: RTTI symbol not found for class 'flecsi::data_client_handle_base__<flecsi::supplemental::test_mesh_2d_t, 0ul, flecsi::legion_data_client_handle_policy_t>'
..., args#1=...)
at /home/jgraham/github/flecsi/flecsi/execution/task.h:94
#17 0x000000000058e86d in flecsi::execution::driver (argc=1, argv=0x11c2450)
at /home/jgraham/github/flecsi/flecsi/execution/test/dense_data.cc:239
#18 0x000000000060b40c in flecsi::execution::runtime_driver (
task=0x145fa4001a00, regions=std::vector of length 0, capacity 0,
ctx=0x145fa40024d0, runtime=0x1289cd0)
at /home/jgraham/github/flecsi/flecsi/execution/legion/runtime_driver.cc:505
#19 0x000014601b34732f in Legion::LegionTaskWrapper::legion_task_wrapper<&flecsi::execution::runtime_driver> (args=0x145fa4004f60, arglen=8, userdata=0x0,
userlen=0, p=...)
at /home/jgraham/opt/flecsi/include/legion/legion.inl:8782
#20 0x0000146019cdf8d2 in Realm::LocalTaskProcessor::execute_task (
this=0x12875e0, func_id=<optimized out>, task_args=...)
at /home/jgraham/github/flecsi-third-party/legion/runtime/realm/bytearray.inl:58
#21 0x0000146019d14b7f in Realm::Task::execute_on_processor (
this=0x145fa4019eb0, p=...)
at /home/jgraham/github/flecsi-third-party/legion/runtime/realm/runtime_impl.h:317
#22 0x0000146019d14cb2 in Realm::UserThreadTaskScheduler::execute_task (
this=<optimized out>, task=<optimized out>)
at /home/jgraham/github/flecsi-third-party/legion/runtime/realm/tasks.cc:1084
#23 0x0000146019d16e0e in Realm::ThreadedTaskScheduler::scheduler_loop (
this=0x1287970)
at /home/jgraham/github/flecsi-third-party/legion/runtime/realm/tasks.cc:591
#24 0x0000146019d19439 in Realm::Thread::thread_entry_wrapper<Realm::ThreadedTaskScheduler, &Realm::ThreadedTaskScheduler::scheduler_loop> (
obj=<optimized out>)
at /home/jgraham/github/flecsi-third-party/legion/runtime/realm/threads.inl:128
#25 0x0000146019d1a5aa in Realm::UserThread::uthread_entry ()
at /home/jgraham/github/flecsi-third-party/legion/runtime/realm/threads.cc:981
#26 0x0000146018039030 in ?? () from /lib64/libc.so.6
#27 0x0000000000000000 in ?? ()
(gdb)

Yes, I noticed it, and I get a similar error with flecsi_sp_burton_2d. I think they may come from the same bug. Here is what I get with Burton_2d.
[F0830 17:51:57 client.h:471] invalid index subspace
[F./flecsi_sp_burton_2d(cinch::log_message_t<bool ()>::~log_message_t()+0x69) [0x61af09]
./flecsi_sp_burton_2d(cinch::fatal_log_message_t::~fatal_log_message_t()+0x4a) [0x611f5a]
./flecsi_sp_burton_2d(flecsi::data_client_handle_base__<flecsi_sp::burton::burton_mesh__<2ul, false>, 0ul, flecsi::legion_data_client_handle_policy_t> flecsi::data::data_client_policy_handler__<flecsi::topology::mesh_topology__<flecsi_sp::burton::burton_types_t<2ul, false> > >::get_client_handle<flecsi_sp::burton::burton_mesh__<2ul, false>, 126879381284205ul, 207910823277ul>()+0xd55) [0x629f92]
./flecsi_sp_burton_2d(flecsi::data_client_handle_base__<flecsi_sp::burton::burton_mesh__<2ul, false>, 0ul, flecsi::legion_data_client_handle_policy_t> flecsi::data::data_client_interface__flecsi::data::legion_data_policy_t::get_client_handle<flecsi_sp::burton::burton_mesh__<2ul, false>, 126879381284205ul, 207910823277ul>()+0x18) [0x6211ce]
0830 17:51:57 ./flecsi_sp_burton_2d(flecsi::execution::driver(int, char**)+0x28) [0x6102ea]

At this line
https://github.com/laristra/flecsi/blob/feature/replication/flecsi/execution/legion/execution_policy.h#L466
we get [0 - 145fac7a2700] {5}{runtime}: [error 303] LEGION ERROR: Projection region requirement 0 used in non-index space task flecsi::execution::init
where init is from
https://github.com/laristra/flecsi/blob/feature/replication/flecsi/execution/test/dense_data.cc#L126-L141
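To unpack that error: in Legion, a projection region requirement is a rule that maps each point of an index-space launch to a subregion of a partition; a single (non-index) task launch has no launch point to feed such a rule, which is what "Projection region requirement 0 used in non-index space task" is complaining about. A conceptual analogy in plain Python (not the Legion API; all names here are illustrative):

```python
# Conceptual analogy only, not Legion's API. A "projection" maps a
# launch-domain point to a subregion; it is only meaningful when there
# IS a launch domain, i.e. an index-space launch.

partition = {0: "subregion_0", 1: "subregion_1"}  # hypothetical partition

def projection(point):
    """Identity-style projection: launch point -> subregion."""
    return partition[point]

def index_launch(points, proj):
    # Each task in the index launch gets the subregion the projection picks.
    return {p: proj(p) for p in points}

def single_launch(proj):
    # A single task has no launch point, so the projection cannot be applied;
    # this mirrors Legion's error 303 for the init task above.
    raise ValueError("projection requirement used in non-index space task")

print(index_launch([0, 1], projection))  # one subregion per launch point
try:
    single_launch(projection)
except ValueError as e:
    print(e)
```

So the failure suggests the feature/replication execution policy is attaching a projection-style requirement while still launching init as an individual task rather than an index-space launch.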

@jpietarilagraham I believe that this is fixed in the replication_merge branch, so I am closing it.