Consider that we are syncing large numbers of issues from Jira DC to Jira Cloud.
This involves preloading issues to the cloud instance via CSV, then using Bulk-Connect and Bulk-Exalate to sync 400-500 records at a time.
However, once the records have been connected/exalated, we need to:
- keep those connected/exalated records updated;
- sync newly created issues to the cloud.
I suspect I can create a trigger with the following JQL, but I want to make sure it will only sync connected issues and newly created issues, and will NOT sync changes to older issues that are still waiting to be bulk-connected:
project = "MYPROJECT" and (issue in under_sync() or created >= 2021-10-17)
Hello, Kevin Ketchum
Another solution for your problem would be to instruct Exalate to reuse the imported issues instead of creating new ones on the target side (then you wouldn’t have to ignore any issues).
One way to achieve that is to assign the issue.id property.
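In its simplest form, that just means setting issue.id during the first sync; for example, hard-coding a hypothetical internal id would look like:

if (firstSync) {
    issue.id = "10042" // internal Jira id of an existing issue on this side (hypothetical value)
}

The snippet below does the same thing dynamically, by looking up the imported issue over the Jira REST API instead of hard-coding the id.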
Please insert this snippet at the beginning of your “incoming sync” script:
if (firstSync) {
    // Look up the issue on this side that has the same key as the source issue.
    def jc = new JiraClient(httpClient)
    def method = "GET"
    def path = "/rest/api/2/issue/${replica.key}"
    def jIssue = jc.http(
        method,
        path,
        [:],  // no query parameters
        null, // no request body
        [:]   // no extra headers
    ) { response ->
        if (response.code >= 300 && response.code != 404) {
            throw new com.exalate.api.exception.IssueTrackerException(
                "Failed to perform the request $method $path (status ${response.code}), and body was: \n\"$response.body\"\nPlease contact Exalate Support".toString()
            )
        }
        if (response.code == 404) {
            // No issue with that key exists here; Exalate will create a new one.
            return null
        }
        // Parse the issue JSON returned by Jira.
        def r = response.body as String
        def js = new groovy.json.JsonSlurper()
        def rJ = js.parseText(r)
        rJ
    }
    if (jIssue != null && jIssue.fields.summary == replica.summary) {
        // Same key and same summary: reuse the imported issue instead of creating a new one.
        issue.id = jIssue.id as String
        // debug.error("bingo!")
    }
    // debug.error("jIssue=$jIssue")
}
And it’s going to:
- look up the issue on this side with the same issue key as on the source;
- if that issue also has the same summary as on the source, tell Exalate not to create a new issue, but to connect to that existing one instead.
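For context, here is a minimal sketch of how this could sit in a fuller incoming sync script; the project key and field mappings below are assumptions to adjust to your setup:

if (firstSync) {
    // ... the lookup snippet from above goes here ...
    if (issue.id == null) {
        // No matching imported issue was found: fall back to creating a new one.
        issue.projectKey = "MYPROJECT" // assumed target project
        issue.typeName = replica.type?.name
    }
}
// Field mappings run on every sync, for both matched and newly created issues.
issue.summary = replica.summary
issue.description = replica.description
issue.labels = replica.labels

Note that the snippet only matches on the issue key and summary; if your CSV import may have altered summaries, you could extend the check to compare other fields before assigning issue.id.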