[es] elasticsearch index copy on Mac
Index copy

The script below copies an index from the production cluster to a dev cluster: it recreates the index settings and mappings, then streams every document over using the scroll API and the bulk helper.
# -*- coding: utf-8 -*-
import ssl
import urllib3
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

print(ssl.OPENSSL_VERSION)

# verify_certs=False is used below, so silence the InsecureRequestWarning noise
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
def create_index(INDEX, ORIGIN):
    # Resolve the concrete index name behind the alias on the production cluster
    indices = prd_client.indices.get_alias(index=ORIGIN)
    first_key = next(iter(indices))

    setting = prd_client.indices.get_settings(index=ORIGIN)
    mapping = prd_client.indices.get_mapping(index=ORIGIN)
    settings = setting[first_key]

    # Carry over only the reusable index settings (shards, replicas, analysis)
    new_setting = {}
    new_setting["number_of_shards"] = settings["settings"]["index"]["number_of_shards"]
    new_setting["number_of_replicas"] = settings["settings"]["index"]["number_of_replicas"]
    new_setting["analysis"] = settings["settings"]["index"]["analysis"]

    body = {}
    body["settings"] = new_setting
    body["mappings"] = mapping[first_key]["mappings"]

    # Create the target index on the dev cluster with the copied settings/mappings
    return dev_client.indices.create(index=INDEX, body=body)
def process_doo(INDEX, ORIGIN):
    # Create the target index with the source settings/mappings first
    create = create_index(INDEX, ORIGIN)
    print(create)

    # Open a scroll over the whole source index, 2000 documents per page
    response = prd_client.search(
        index=ORIGIN,
        scroll=_KEEP_ALIVE_LIMIT,
        body={
            "size": 2000,
            "query": {
                "match_all": {}
            },
            "sort": [
                {
                    "itemNo": {
                        "order": "desc"
                    }
                }
            ]
        }
    )
    sid = response['_scroll_id']
    fetched = len(response['hits']['hits'])

    # Buffer of bulk actions for the dev cluster
    actions = []
    for doc in response['hits']['hits']:
        actions.append({
            '_index': INDEX,
            '_source': doc['_source']
        })
    bulk(dev_client, actions)
    actions = []

    # Keep scrolling until the source returns no more hits
    while fetched > 0:
        response = prd_client.scroll(scroll_id=sid, scroll=_KEEP_ALIVE_LIMIT)
        fetched = len(response['hits']['hits'])
        for doc in response['hits']['hits']:
            print(doc['_source']['itemNm'])
            actions.append({
                '_index': INDEX,
                '_source': doc['_source']
            })
        bulk(dev_client, actions)
        actions = []
if __name__ == '__main__':
    _KEEP_ALIVE_LIMIT = '30m'

    # Source (production) index/alias names
    INDEX_HYPER = "hyper-item"
    INDEX_DS = "ds-item"
    INDEX_EXP = "exp-item"

    # Target (local dev) index names
    EASY_HYPER = "local-prd-hyper-item"
    EASY_DS = "local-prd-ds-item"
    EASY_EXP = "local-prd-exp-item"

    # Replace user:password@host:port with the real credentials for each cluster
    prd_client = Elasticsearch("https://user:password@host:port/", timeout=30, max_retries=10,
                               retry_on_timeout=True, ca_certs=False, verify_certs=False)
    dev_client = Elasticsearch("https://user:password@host:port/", timeout=30, max_retries=10,
                               retry_on_timeout=True, ca_certs=False, verify_certs=False)

    process_doo(EASY_HYPER, INDEX_HYPER)
    process_doo(EASY_EXP, INDEX_EXP)
    process_doo(EASY_DS, INDEX_DS)
    print("End.")
Registering a LaunchDaemon

Move into the directory: cd /Library/LaunchDaemons
You could create a new file there, but since dealing with permissions is a hassle, copy an existing file with cp -af and then edit it: sudo vi com.search.indexer.plist
Run every Friday at 17:00:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.search.indexer</string>
    <key>ProgramArguments</key>
    <array>
        <string>/Users/doo/shell/prd_index_copy.sh</string>
    </array>
    <key>StandardOutPath</key>
    <string>/Users/doo/shell/log/logfile.log</string>
    <key>StandardErrorPath</key>
    <string>/Users/doo/shell/log/errorfile.log</string>
    <key>StartCalendarInterval</key>
    <dict>
        <key>Weekday</key>
        <integer>5</integer>
        <key>Hour</key>
        <integer>17</integer>
        <key>Minute</key>
        <integer>0</integer>
    </dict>
</dict>
</plist>
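Note that the plist points at /Users/doo/shell/prd_index_copy.sh, which is not shown in this post; the assumption is that it is an executable wrapper (shebang plus chmod +x) that runs the Python copy script above with an interpreter that has the elasticsearch package installed.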
Permissions: root:wheel is usually the right ownership for files in /Library/LaunchDaemons:
sudo chown root:wheel /Library/LaunchDaemons/com.search.indexer.plist
Load and run: load the configured plist so that launchd can execute the script. In the terminal:
sudo launchctl load /Library/LaunchDaemons/com.search.indexer.plist
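As a rough check that the job actually loaded, the label should appear in the output of sudo launchctl list | grep com.search.indexer.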