Ask & Answer
-
[Unresolved] Introduction to Python Algorithm Problem Solving (Coding Test Prep)
A question about building the check array
When you build a 2D check array, you always write ch = [[0] * n for _ in range(n)]. Why can't it be written as ch = [[0] * n] * n instead? For example, with n = 3:
ch = [[0] * 3 for _ in range(3)]; print(ch) => [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
ch2 = [[0] * 3] * 3; print(ch2) => [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
Both print the same thing, but after assigning ch[0][2] = 1 and ch2[0][2] = 1:
print(ch) => [[0, 0, 1], [0, 0, 0], [0, 0, 0]]
print(ch2) => [[0, 0, 1], [0, 0, 1], [0, 0, 1]]
That is the result I get. Please explain how the two are different.
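For reference, a minimal sketch (not from the lecture) of why the results differ: [[0] * n] * n repeats a reference to one and the same inner list n times, while the comprehension creates a brand-new inner list on every iteration, which the `is` check makes visible.

n = 3
ch = [[0] * n for _ in range(n)]   # n independent rows, one new list per iteration
ch2 = [[0] * n] * n                # a single row object, referenced n times
print(ch[0] is ch[1])              # False: separate objects
print(ch2[0] is ch2[1])            # True: every "row" is the same object
ch[0][2] = 1                       # changes only the first row
ch2[0][2] = 1                      # changes the one shared row, so every row shows it
print(ch)                          # [[0, 0, 1], [0, 0, 0], [0, 0, 0]]
print(ch2)                         # [[0, 0, 1], [0, 0, 1], [0, 0, 1]]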
-
[Unresolved] [2026 Exam Standards] Complete Guide to the Web Design Development Craftsman Practical Exam
Border design and notice-section design
1) In the horizontal navigation, the number of a tags in the submenu makes it turn out like this — how should I fix it? That area has no border. 2) I set margin-bottom: -5px on .btn span, but as in the image above, the bottom border rises above the span. What did I get wrong? I have gone over it again and still can't find the mistake.
-
[Unresolved] Java ORM Standard JPA Programming - Basics
Question about "flush occurs" at 0:55
I have a question about "flush occurs" at 0:55. As drawn and explained in the earlier lecture slides, when persist() is called or a managed entity is modified, the entity in the persistence context's first-level cache is updated and, at the same time, the SQL is added to the write-behind SQL store. So I expected that calling flush() would simply send the already-queued batch SQL as queries, commit the DB transaction, and then update the first-level cache snapshots. In this lecture, however, you explain that it is at the moment em.flush() is called that "dirty checking" runs and "the UPDATE SQL for modified entities is registered in the write-behind SQL store". The two explanations give different timings — which one is correct?
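For reference, a minimal sketch of the sequence being asked about (the Member entity and the persistence unit name "hello" are assumptions for illustration, not code from the course):

import javax.persistence.*;

@Entity
class Member {                       // hypothetical entity used only for this sketch
    @Id @GeneratedValue Long id;
    String name;
}

public class FlushTimingSketch {
    public static void main(String[] args) {
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("hello"); // assumed unit name
        EntityManager em = emf.createEntityManager();
        EntityTransaction tx = em.getTransaction();
        tx.begin();

        Member member = new Member();
        member.name = "A";
        em.persist(member);          // the INSERT is queued in the write-behind SQL store

        member.name = "B";           // only the managed copy in the first-level cache changes here

        em.flush();                  // at flush time: entities are compared against their snapshots
                                     // (dirty checking), the UPDATE SQL is generated and queued, and
                                     // the queued SQL is sent to the DB -- flush does not commit
        tx.commit();                 // commit triggers a final flush if needed, then commits the tx

        em.close();
        emf.close();
    }
}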
-
[Unresolved] [MMORPG Game Development Series with C# and Unity] Part 3: Unity Engine
Question about the explanation at 6:50 (time calculation)
A question came up while listening to the lecture!

float deltaTime = 0;
void ExplodeAfter4Secod()
{
    deltaTime += Time.deltaTime;
    if (deltaTime >= 4)
    {
        // logic
    }
}

You explained that if the code is written this way, every frame the computer adds Time.deltaTime to deltaTime, checks the if statement, adds time again, checks again, and so on, and that this series of operations is crude and inefficient. So instead we build a time manager and hand it "I want to do something 4 seconds from now!"; when 4 seconds have passed the manager notifies us and the logic runs, which skips the waste of adding time and checking an if statement every single frame. In short, managing this centrally reduces waste. My questions (see the sketch after this list):
1. If I hand "do something in 4 seconds!" to the time manager (the central place), the manager counts the time and eventually tells the function "4 seconds have passed!" so its logic runs. But the manager also has to do that computation before notifying the caller, so isn't it ultimately the same amount of work? (I don't quite understand the principle.)
2. When I send the time manager "do something in 4 seconds!", can it really notify the function without computing every tick (every frame)?
3. If the manager does not compute every frame, how does it measure the time so that it can notify the function?
4. You said coroutines are very useful for cases 1-3. By what principle can a coroutine "pause for 4 seconds" and then resume? Even while paused, it seems it would need to know how long it has been paused in order to notify after 4 seconds — does it keep computing during the pause?
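For what it's worth, here is a minimal sketch (class and method names are made up, not the course's Manager code) of one common pattern: the per-frame bookkeeping does not disappear, it is just done once, centrally, against the earliest deadline, instead of inside every waiting object's Update. Unity's own coroutine scheduler works along the same lines for yield return new WaitForSeconds(4f): the engine tracks the elapsed time and resumes the suspended method later, so your own code contains no per-frame if check.

using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Illustrative only: pending jobs are kept sorted by their absolute fire time,
// so each frame only the front of the list is compared against the clock.
public class TimerManagerSketch : MonoBehaviour
{
    private readonly List<(float fireTime, Action callback)> _jobs =
        new List<(float fireTime, Action callback)>();

    public void Schedule(float delaySeconds, Action callback)
    {
        var job = (Time.time + delaySeconds, callback);
        int i = _jobs.FindIndex(j => j.fireTime > job.Item1);
        if (i < 0) _jobs.Add(job); else _jobs.Insert(i, job);
    }

    private void Update()
    {
        // One comparison per frame while nothing is due; a callback fires only when its time arrives.
        while (_jobs.Count > 0 && _jobs[0].fireTime <= Time.time)
        {
            Action due = _jobs[0].callback;
            _jobs.RemoveAt(0);
            due();
        }
    }

    // Coroutine version: the engine's scheduler does the waiting, and this method body
    // simply is not executed again until roughly 4 seconds have elapsed.
    private IEnumerator ExplodeAfter4Seconds()
    {
        yield return new WaitForSeconds(4f);
        // logic
    }
}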
-
[Unresolved] Learning Kubernetes with Pictures (v1.30) - {{ x86-64, arm64 }}
Hello, I have a question about storage classes.
My current setup: a local PC and an AWS EC2 instance are connected with a site-to-site VPN, and the Kubernetes cluster is built with the local PC as the master node and the EC2 instance as a worker node. A friend who is very good at this built the infrastructure, and I thought it would be nice to configure a StorageClass on top of it and implement dynamic provisioning, so I have been working on the local master node using the Kubernetes documentation and what we covered in the lectures, but it is not coming together as well as I expected. What I want to do is create a StorageClass on the local master node that uses the EBS attached to the EC2 instance (which is joined to the cluster) as the provisioner — is this technically feasible? The more I search, the more I find AWS Storage Gateway and DataSync with S3; do I have to use one of those, or can I attach volumes from the EBS connected to the EC2 instance? Before asking I looked around quite a bit: there are some examples of NFS-based StorageClasses, but I could not find a good reference for this particular case, so I am asking here.
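For reference, a dynamically provisioning StorageClass backed by EBS is usually declared roughly like the sketch below (the name and parameters are illustrative). It assumes the AWS EBS CSI driver is installed in the cluster, and note that EBS volumes can only be attached to EC2 instances, so the pods consuming them would have to be scheduled onto the EC2 worker node rather than onto the local master.

# Illustrative sketch, not a verified setup for this hybrid cluster
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: ebs-sc                            # illustrative name
provisioner: ebs.csi.aws.com              # requires the AWS EBS CSI driver
volumeBindingMode: WaitForFirstConsumer   # delay volume creation until the pod lands on an (EC2) node
parameters:
  type: gp3                               # illustrative volume type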
-
[Unresolved] Typed Python! Proper class Usage and Object-Oriented Programming
Are type hints used much in practice?
I have not seen much production code yet, so forgive the basic question: Django and Flask code usually seems to have no type hints, while FastAPI appears to use them heavily. Are type hints also commonly used with Django or Flask?
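As a small illustration (the endpoint and field names are made up): type hints are ordinary annotations, so they can be written in a Flask view exactly as in FastAPI; the difference is that FastAPI also reads them at runtime for validation and serialization, whereas Flask ignores them and they only help editors and checkers such as mypy.

from flask import Flask, Response, jsonify

app = Flask(__name__)

@app.route("/users/<int:user_id>")        # the route converter already delivers an int
def get_user(user_id: int) -> Response:
    # Flask does not act on these annotations; they document the contract and let
    # tools like mypy/pyright check callers.
    return jsonify({"id": user_id, "name": "example"})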
-
[Resolved] Easy Computer Networking from a 20-Year Veteran
A question about L2 switches
Hello, I have been enjoying the lectures whenever I find the time. I learned that a switch is a device that forwards frames based on MAC addresses. Given that, by looking up the switch's MAC table, can the following two things be determined for its ports? 1. Can I tell what each port is connected to? 2. Are uplink ports marked separately (and if not, is there a basic rule)? Those two items are what I am curious about.
-
[Unresolved] Learn Power BI Quickly and Easily: Advanced 1 (Visualization and Power Query)
Where the dataset goes when an Old table is added
After creating an Old table in the same format as Youth and clicking refresh, an old chart appeared. If that data were stored locally, I would naturally have to go into Transform Data, remove the previous dataset entirely, and load the new one, right? My company uses OneDrive — should I connect it to OneDrive and work that way? There is no DB; we create monthly folders in OneDrive and go in and update them by hand each time.
-
[Unresolved] 설계독학맛비's Hands-On HW Accelerator Design with FPGA (from LED Control to a Fully Connected Layer Accelerator)
Why you create design_1_wrapper
In the lecture you say the top level has to be a Verilog file and create design_1_wrapper. Why is this done? (Is it in order to connect the input/output ports created with Create Block Design?)
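For context, a minimal sketch of what such a wrapper amounts to (the port names here are illustrative, not this project's actual ports): it is a plain Verilog module with no logic of its own that just instantiates the block design, so the synthesis top level is an ordinary HDL module rather than the .bd container.

// Illustrative only: roughly what Vivado's "Create HDL Wrapper" generates
module design_1_wrapper (
    input  wire       sys_clock,   // assumed port
    input  wire       reset,       // assumed port
    output wire [3:0] led          // assumed port
);

    // No logic here -- the wrapper only exposes the block design's ports as a Verilog top.
    design_1 design_1_i (
        .sys_clock (sys_clock),
        .reset     (reset),
        .led       (led)
    );

endmodule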
-
[Unresolved] Kubernetes Master for DevOps
Question about a pod stuck in Pending
When a pod could not be deployed because the node lacked resources, I freed up node resources and deployed the pod again, but it was still Pending. Even with plenty of resources free on the node it stays Pending — in a case like this, does the node automatically get NoSchedule or a taint applied? In the end, deleting the resource requests section from the Deployment YAML brought it back to Running, but I would like to know why. Summary: a pod would not run because the node was short on resources; after freeing resources and redeploying, it still stays Pending.
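For context, a sketch of the requests block in question (names and numbers are illustrative): the scheduler reserves the full requested amount out of the node's allocatable capacity regardless of actual usage, and kubectl describe pod <pod-name> shows a FailedScheduling event stating exactly which request could not be satisfied, which is usually the quickest way to see why a pod stays Pending.

# Illustrative Deployment fragment, not the actual manifest from the question
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo
spec:
  replicas: 1
  selector:
    matchLabels: { app: demo }
  template:
    metadata:
      labels: { app: demo }
    spec:
      containers:
      - name: app
        image: nginx                 # illustrative image
        resources:
          requests:
            cpu: "500m"              # reserved against the node's allocatable CPU
            memory: 512Mi            # reserved against the node's allocatable memory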
-
[Unresolved] [Revised Edition] The Complete Guide to Deep Learning Computer Vision
Converting an EfficientDet model to TFLite
Hello, I am the student who previously asked about the TFLite conversion process. Last time you suggested using EfficientDet instead of YOLO, so I switched models and trained again. The automl repository on GitHub has a feature for converting to TFLite, and when I used it I got the error shown here, so I am asking about it. It looks like a batch_size problem, but I cannot tell which part is wrong...
-
[Resolved] ES6 Course for Modern JavaScript Development
A question about checking types
Hello, you said we can check a value's type using toString.call() — when is that used? And what is the difference compared with typeof?
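A quick sketch of the difference (values are arbitrary examples): typeof only distinguishes a handful of primitive categories and lumps most objects together, while Object.prototype.toString.call() reports the internal tag, so arrays, null, dates, and so on can be told apart.

console.log(typeof []);                                   // "object"
console.log(typeof null);                                 // "object"  (a long-standing quirk)
console.log(typeof new Date());                           // "object"
console.log(typeof "hi");                                 // "string"

console.log(Object.prototype.toString.call([]));          // "[object Array]"
console.log(Object.prototype.toString.call(null));        // "[object Null]"
console.log(Object.prototype.toString.call(new Date()));  // "[object Date]"
console.log(Object.prototype.toString.call("hi"));        // "[object String]"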
-
[Resolved] Responsive Web, Learned Easily and Built as a Portfolio! #설화수
The .show angle-down icon (opacity 0) is not working
The angle-up icon does not disappear (opacity 0 is not being applied).
-
[Unresolved] The Real All-in-One Beginner Development Bootcamp for Non-Majors
It just keeps loading with "Fetching product information..."
When I click a product and navigate, this is what shows up. I don't know which part I got wrong! Just in case, here are my source files.

[ App.css ]

/* after removing the default React CSS */
html, body, #root, #root > div {
  height: 100%;
}
#header {
  height: 64px;
  display: flex;
  justify-content: center;
  border-bottom: 1px solid grey;
}
#body {
  height: 100vh;
  width: 1024px;
  margin: 0 auto;
  padding-bottom: 24px;
}
#footer {
  height: 200px;
  background-color: rgb(230, 230, 230);
}

[ App.js ]

// import logo from './logo.svg';
import './App.css';
// import MainPage from './main/index.js'; >> './main' works fine too!
import MainPage from './main';
import { SWitch, Route } from 'react-router-dom'
import ProductPage from './product';
import UploadPage from './upload';
import { Switch } from 'react-router-dom/cjs/react-router-dom.min';

function App() {
  return (
    <div>
      <div id="header">
        <div id="header-area">
          {/* <img src="images/icons/logo.png"/> // if the code you wrote happens not to work, changing the image source path to an absolute one fixes it << absolute path */}
          <img src="/images/icons/logo.png"/>
        </div>
      </div>
      <div id="body">
        {/* approach 01 */}
        <Switch>
          <Route exact={true} path="/">
            <MainPage />
          </Route>
          <Route exact={true} path="/products/:id">
            <ProductPage />
          </Route>
          <Route exact={true} path="/upload">
            <UploadPage />
          </Route>
        </Switch>
      </div>
      <div id="footer"></div>
    </div>
  );
}

export default App;

[ main > index.js ]

// Hook up the CSS we worked on before.
// There is a problem with it, though >> so a few things in App.css need to be adjusted.
import "./index.css"
import axios from "axios"; // axios: fetch from the server
import React from 'react'; // React
import { Link } from 'react-router-dom'; // pull in Link so clicking a product navigates onward

function MainPage() {
  // return <p>MainPage</p>
  // React state for the product information.
  // The state is array-shaped, so we pass an empty array with useState([]).
  const [products, setProducts] = React.useState([]);
  // Use React.useEffect so constant re-fetching doesn't overload the machine;
  // in other words, fetch once and then communicate normally.
  React.useEffect(
    function () {
      // fetch the product address of the mock server created in Postman
      axios.get('https://4326fdea-003a-4291-b8b3-b8e47b10723c.mock.pstmn.io/products')
        .then(function (result) {
          console.log(result);
          const products = result.data.products;
          setProducts(products); // solves the endless-repetition problem!
        })
        .catch(function (error) {
          console.error("Error occurred: ", error);
        });
    },
    []
  )
  return (
    //// Create one <div> first, then put everything that used to be written in the HTML body inside it.
    <div>
      {/* From here the products have to come from the server. */}
      {/* <div id="header"> <div id="header-area"> <img src="images/icons/logo.png" /> </div> </div> <div id="body"> */}
      <div id="banner">
        <img src="images/banners/banner1.png" />
      </div>
      <h1>판매되는 상품들</h1>
      <div id="product-list">
        {
          products.map(function (product, index) {
            // function(parameters > product is the item, index selects which one)
            return (
              <div className='product-card'>
                {/* Set things up so that clicking the product-card (the product) navigates to that product's detail page.
                    In React we navigate with Link, but the browser renders it as an <a> tag, i.e. you can see it converted to <a>. */}
                {/* <Link className="product-link" to={'/products/' + index}> */}
                {/* <Link className="product-link" to={`/products/ + ${index}`}> */}
                {/* Use index so that pressing 0, 1, ... shows a different product each time, and use the backtick (template literal) style */}
                <Link className="product-link" to={`/products/ + ${product.id}`}>
                  <div>
                    <img className='product-img' src={product.imageUrl} />
                  </div>
                  <div className='product-contents'>
                    <span className='product-name'> {product.name} </span>
                    <span className='product-price'> {product.price}원 </span>
                    <div className='product-seller'>
                      <img className='product-avatar' src='images/icons/avatar.png' />
                      <span> {product.seller} </span>
                    </div>
                  </div>
                </Link>
              </div>
            );
          })
        }
      </div>
      {/* </div> <div id="footer"> </div> */}
    </div>
  );
}

export default MainPage; // meaning we export MainPage

[ product > index.js ]

import { useParams } from 'react-router-dom';
import axios from 'axios';
import { useEffect, useState } from 'react';

function ProductPage() {
  // const prams = useParams();
  // console.log(prams); // now when a product is selected, the console shows id: 0, 1, 2... the product number.
  const { id } = useParams();
  const [product, setProduct] = useState(null); // null at first; after the first render the product is fetched from the server and set.

  useEffect(function () {
    axios.get('https://4326fdea-003a-4291-b8b3-b8e47b10723c.mock.pstmn.io/products'+ id)
      .then(function (result) {
        setProduct(result.data);
        // console.log(result);
      })
      .catch(function (error) {
        console.log(error);
      })
  }, []);

  // console.log(product);
  // return <h1>상품 상세 페이지{id} 상품</h1>
  // JS syntax
  if (product == null) {
    return <h1>상품 정보를 받고 있습니다...</h1>
  }

  return (
    <div>
      <div id="image-box">
        <img src={"/" + product.imageUrl}/>
        {/* Because const [product, setProduct] = useState(null); works asynchronously, product.imageUrl is of course treated as null at first, so it is only natural that an error occurs then. */}
      </div>
      <div id="profile-box">
        <img src="/images/icons/avatar.png"/>
        <span>{product.seller}</span>
      </div>
    </div>
  );
}

export default ProductPage;

The {{url}}/products address in Postman is https://4326fdea-003a-4291-b8b3-b8e47b10723c.mock.pstmn.io

And GET products > Default:

{
  "products" : [
    { "id" : 1, "name": "농구공", "price": 10000, "seller": "로뎀", "imageUrl": "images/products/basketball1.jpeg" },
    { "id" : 2, "name": "축구공", "price": 50000, "seller": "RockLee", "imageUrl": "images/products/soccerball1.jpg" },
    { "id" : 3, "name": "키보드", "price": 15000, "seller": "테란황제", "imageUrl": "images/products/keyboard1.jpg" }
  ]
}

GET products/1:

{
  "id": 1,
  "name": "농구공",
  "price": 10000,
  "seller": "로뎀",
  "imageUrl": "images/products/basketball1.jpeg",
  "description": "조던이 사용하던 농구공입니다"
}

Sorry for asking so many questions; I still have a lot to learn. I keep checking and checking, but this time I can't find where the problem is.
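One detail worth a look in the code above (a sketch of how template literals evaluate, not a confirmed diagnosis of the whole problem): inside backticks, `/products/ + ${product.id}` produces the literal text "/products/ + 1", spaces and plus sign included, and '.../products' + id concatenates without a slash before the id. A small runnable illustration:

const id = 1;                       // example value
console.log(`/products/ + ${id}`);  // "/products/ + 1" -- the " + " is literal text inside backticks
console.log(`/products/${id}`);     // "/products/1"
console.log('.../products' + id);   // ".../products1" -- no "/" before the id
console.log(`.../products/${id}`);  // ".../products/1"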
-
[Unresolved] [10 Minutes a Day | Web Project] Build Your Own Personality-Test Site with HTML/JS/CSS
I want to hide the Kakao JS app key
The Kakao app key I was issued is visible as-is through the browser developer tools (Sources tab). Is it okay for the key to be exposed? Is there a way to hide it, for example by encrypting the code?
-
[Unresolved] Practical! Spring Boot and JPA Part 1 - Web Application Development
Downloaded the code to get the overall picture
I go through the lectures slowly, so to first get a sense of how everything fits together, I downloaded the full source code, unzipped it, and dragged the folder into IntelliJ to open it, at which point an error occurred. I am wondering whether there is any additional configuration I need to do. Just in case, here is the full error output:

5:20:43 PM: Executing task 'JpashopApplication.main()'... > Task :compileJava > Task :processResources > Task :classes > Task :JpashopApplication.main() . ____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v2.4.1) 2021-11-17 17:21:08.614 INFO 22472 --- [ main] jpabook.jpashop.JpashopApplication : Starting JpashopApplication using Java 11.0.9 on DESKTOP-BOB4SV1 with PID 22472 (C:\Users\ujung\Desktop\gtest\jpashop-v20210728\build\classes\java\main started by ujung in C:\Users\ujung\Desktop\gtest\jpashop-v20210728) 2021-11-17 17:21:08.627 INFO 22472 --- [ main] jpabook.jpashop.JpashopApplication : No active profile set, falling back to default profiles: default 2021-11-17 17:21:10.400 INFO 22472 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data JPA repositories in DEFAULT mode. 2021-11-17 17:21:10.445 INFO 22472 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 32 ms. Found 0 JPA repository interfaces. 2021-11-17 17:21:12.198 INFO 22472 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 8080 (http) 2021-11-17 17:21:12.233 INFO 22472 --- [ main] o.apache.catalina.core.StandardService : Starting service [Tomcat] 2021-11-17 17:21:12.233 INFO 22472 --- [ main] org.apache.catalina.core.StandardEngine : Starting Servlet engine: [Apache Tomcat/9.0.41] 2021-11-17 17:21:12.517 INFO 22472 --- [ main] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext 2021-11-17 17:21:12.518 INFO 22472 --- [ main] w.s.c.ServletWebServerApplicationContext : Root WebApplicationContext: initialization completed in 3708 ms 2021-11-17 17:21:12.977 INFO 22472 --- [ main] o.hibernate.jpa.internal.util.LogHelper : HHH000204: Processing PersistenceUnitInfo [name: default] 2021-11-17 17:21:13.161 INFO 22472 --- [ main] org.hibernate.Version : HHH000412: Hibernate ORM core version 5.4.25.Final 2021-11-17 17:21:13.700 INFO 22472 --- [ main] o.hibernate.annotations.common.Version : HCANN000001: Hibernate Commons Annotations {5.1.2.Final} 2021-11-17 17:21:14.154 INFO 22472 --- [ main] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Starting... 2021-11-17 17:21:17.545 ERROR 22472 --- [ main] com.zaxxer.hikari.pool.HikariPool : HikariPool-1 - Exception during pool initialization. 
org.h2.jdbc.JdbcSQLNonTransientConnectionException: Connection is broken: "java.net.SocketTimeoutException: connect timed out: localhost" [90067-200] at org.h2.message.DbException.getJdbcSQLException(DbException.java:622) ~[h2-1.4.200.jar:1.4.200] at org.h2.message.DbException.getJdbcSQLException(DbException.java:429) ~[h2-1.4.200.jar:1.4.200] at org.h2.message.DbException.get(DbException.java:194) ~[h2-1.4.200.jar:1.4.200] at org.h2.engine.SessionRemote.connectServer(SessionRemote.java:439) ~[h2-1.4.200.jar:1.4.200] at org.h2.engine.SessionRemote.connectEmbeddedOrServer(SessionRemote.java:321) ~[h2-1.4.200.jar:1.4.200] at org.h2.jdbc.JdbcConnection.<init>(JdbcConnection.java:173) ~[h2-1.4.200.jar:1.4.200] at org.h2.jdbc.JdbcConnection.<init>(JdbcConnection.java:152) ~[h2-1.4.200.jar:1.4.200] at org.h2.Driver.connect(Driver.java:69) ~[h2-1.4.200.jar:1.4.200] at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:138) ~[HikariCP-3.4.5.jar:na] at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:358) ~[HikariCP-3.4.5.jar:na] at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:206) ~[HikariCP-3.4.5.jar:na] at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:477) ~[HikariCP-3.4.5.jar:na] at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:560) ~[HikariCP-3.4.5.jar:na] at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:115) ~[HikariCP-3.4.5.jar:na] at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:112) ~[HikariCP-3.4.5.jar:na] at org.hibernate.engine.jdbc.connections.internal.DatasourceConnectionProviderImpl.getConnection(DatasourceConnectionProviderImpl.java:122) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator$ConnectionProviderJdbcConnectionAccess.obtainConnection(JdbcEnvironmentInitiator.java:180) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator.initiateService(JdbcEnvironmentInitiator.java:68) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator.initiateService(JdbcEnvironmentInitiator.java:35) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.boot.registry.internal.StandardServiceRegistryImpl.initiateService(StandardServiceRegistryImpl.java:101) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.createService(AbstractServiceRegistryImpl.java:263) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:237) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:214) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.id.factory.internal.DefaultIdentifierGeneratorFactory.injectServices(DefaultIdentifierGeneratorFactory.java:152) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.injectDependencies(AbstractServiceRegistryImpl.java:286) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:243) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at 
org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:214) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.boot.internal.InFlightMetadataCollectorImpl.<init>(InFlightMetadataCollectorImpl.java:176) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.boot.model.process.spi.MetadataBuildingProcess.complete(MetadataBuildingProcess.java:127) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.metadata(EntityManagerFactoryBuilderImpl.java:1224) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:1255) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.springframework.orm.jpa.vendor.SpringHibernateJpaPersistenceProvider.createContainerEntityManagerFactory(SpringHibernateJpaPersistenceProvider.java:58) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:365) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.buildNativeEntityManagerFactory(AbstractEntityManagerFactoryBean.java:409) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:396) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.afterPropertiesSet(LocalContainerEntityManagerFactoryBean.java:341) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1847) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1784) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:609) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:531) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1159) ~[spring-context-5.3.2.jar:5.3.2] at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:913) ~[spring-context-5.3.2.jar:5.3.2] at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:588) ~[spring-context-5.3.2.jar:5.3.2] at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:144) ~[spring-boot-2.4.1.jar:2.4.1] at 
org.springframework.boot.SpringApplication.refresh(SpringApplication.java:767) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:759) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:426) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.run(SpringApplication.java:326) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.run(SpringApplication.java:1309) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.run(SpringApplication.java:1298) ~[spring-boot-2.4.1.jar:2.4.1] at jpabook.jpashop.JpashopApplication.main(JpashopApplication.java:10) ~[main/:na] Caused by: java.net.SocketTimeoutException: connect timed out at java.base/java.net.PlainSocketImpl.waitForConnect(Native Method) ~[na:na] at java.base/java.net.PlainSocketImpl.socketConnect(PlainSocketImpl.java:107) ~[na:na] at java.base/java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:399) ~[na:na] at java.base/java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:242) ~[na:na] at java.base/java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:224) ~[na:na] at java.base/java.net.SocksSocketImpl.connect(SocksSocketImpl.java:403) ~[na:na] at java.base/java.net.Socket.connect(Socket.java:608) ~[na:na] at org.h2.util.NetUtils.createSocket(NetUtils.java:103) ~[h2-1.4.200.jar:1.4.200] at org.h2.util.NetUtils.createSocket(NetUtils.java:83) ~[h2-1.4.200.jar:1.4.200] at org.h2.engine.SessionRemote.initTransfer(SessionRemote.java:119) ~[h2-1.4.200.jar:1.4.200] at org.h2.engine.SessionRemote.connectServer(SessionRemote.java:435) ~[h2-1.4.200.jar:1.4.200] ... 
51 common frames omitted 2021-11-17 17:21:17.549 WARN 22472 --- [ main] o.h.e.j.e.i.JdbcEnvironmentInitiator : HHH000342: Could not obtain connection to query metadata org.h2.jdbc.JdbcSQLNonTransientConnectionException: Connection is broken: "java.net.SocketTimeoutException: connect timed out: localhost" [90067-200] at org.h2.message.DbException.getJdbcSQLException(DbException.java:622) ~[h2-1.4.200.jar:1.4.200] at org.h2.message.DbException.getJdbcSQLException(DbException.java:429) ~[h2-1.4.200.jar:1.4.200] at org.h2.message.DbException.get(DbException.java:194) ~[h2-1.4.200.jar:1.4.200] at org.h2.engine.SessionRemote.connectServer(SessionRemote.java:439) ~[h2-1.4.200.jar:1.4.200] at org.h2.engine.SessionRemote.connectEmbeddedOrServer(SessionRemote.java:321) ~[h2-1.4.200.jar:1.4.200] at org.h2.jdbc.JdbcConnection.<init>(JdbcConnection.java:173) ~[h2-1.4.200.jar:1.4.200] at org.h2.jdbc.JdbcConnection.<init>(JdbcConnection.java:152) ~[h2-1.4.200.jar:1.4.200] at org.h2.Driver.connect(Driver.java:69) ~[h2-1.4.200.jar:1.4.200] at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:138) ~[HikariCP-3.4.5.jar:na] at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:358) ~[HikariCP-3.4.5.jar:na] at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:206) ~[HikariCP-3.4.5.jar:na] at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:477) ~[HikariCP-3.4.5.jar:na] at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:560) ~[HikariCP-3.4.5.jar:na] at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:115) ~[HikariCP-3.4.5.jar:na] at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:112) ~[HikariCP-3.4.5.jar:na] at org.hibernate.engine.jdbc.connections.internal.DatasourceConnectionProviderImpl.getConnection(DatasourceConnectionProviderImpl.java:122) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator$ConnectionProviderJdbcConnectionAccess.obtainConnection(JdbcEnvironmentInitiator.java:180) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator.initiateService(JdbcEnvironmentInitiator.java:68) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator.initiateService(JdbcEnvironmentInitiator.java:35) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.boot.registry.internal.StandardServiceRegistryImpl.initiateService(StandardServiceRegistryImpl.java:101) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.createService(AbstractServiceRegistryImpl.java:263) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:237) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:214) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.id.factory.internal.DefaultIdentifierGeneratorFactory.injectServices(DefaultIdentifierGeneratorFactory.java:152) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.injectDependencies(AbstractServiceRegistryImpl.java:286) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at 
org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:243) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:214) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.boot.internal.InFlightMetadataCollectorImpl.<init>(InFlightMetadataCollectorImpl.java:176) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.boot.model.process.spi.MetadataBuildingProcess.complete(MetadataBuildingProcess.java:127) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.metadata(EntityManagerFactoryBuilderImpl.java:1224) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:1255) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.springframework.orm.jpa.vendor.SpringHibernateJpaPersistenceProvider.createContainerEntityManagerFactory(SpringHibernateJpaPersistenceProvider.java:58) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:365) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.buildNativeEntityManagerFactory(AbstractEntityManagerFactoryBean.java:409) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:396) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.afterPropertiesSet(LocalContainerEntityManagerFactoryBean.java:341) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1847) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1784) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:609) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:531) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1159) ~[spring-context-5.3.2.jar:5.3.2] at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:913) ~[spring-context-5.3.2.jar:5.3.2] at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:588) ~[spring-context-5.3.2.jar:5.3.2] at 
org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:144) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:767) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:759) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:426) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.run(SpringApplication.java:326) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.run(SpringApplication.java:1309) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.run(SpringApplication.java:1298) ~[spring-boot-2.4.1.jar:2.4.1] at jpabook.jpashop.JpashopApplication.main(JpashopApplication.java:10) ~[main/:na] Caused by: java.net.SocketTimeoutException: connect timed out at java.base/java.net.PlainSocketImpl.waitForConnect(Native Method) ~[na:na] at java.base/java.net.PlainSocketImpl.socketConnect(PlainSocketImpl.java:107) ~[na:na] at java.base/java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:399) ~[na:na] at java.base/java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:242) ~[na:na] at java.base/java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:224) ~[na:na] at java.base/java.net.SocksSocketImpl.connect(SocksSocketImpl.java:403) ~[na:na] at java.base/java.net.Socket.connect(Socket.java:608) ~[na:na] at org.h2.util.NetUtils.createSocket(NetUtils.java:103) ~[h2-1.4.200.jar:1.4.200] at org.h2.util.NetUtils.createSocket(NetUtils.java:83) ~[h2-1.4.200.jar:1.4.200] at org.h2.engine.SessionRemote.initTransfer(SessionRemote.java:119) ~[h2-1.4.200.jar:1.4.200] at org.h2.engine.SessionRemote.connectServer(SessionRemote.java:435) ~[h2-1.4.200.jar:1.4.200] ... 51 common frames omitted 2021-11-17 17:21:17.610 ERROR 22472 --- [ main] j.LocalContainerEntityManagerFactoryBean : Failed to initialize JPA EntityManagerFactory: Unable to create requested service [org.hibernate.engine.jdbc.env.spi.JdbcEnvironment] 2021-11-17 17:21:17.619 WARN 22472 --- [ main] ConfigServletWebServerApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'entityManagerFactory' defined in class path resource [org/springframework/boot/autoconfigure/orm/jpa/HibernateJpaConfiguration.class]: Invocation of init method failed; nested exception is org.hibernate.service.spi.ServiceException: Unable to create requested service [org.hibernate.engine.jdbc.env.spi.JdbcEnvironment] 2021-11-17 17:21:17.631 INFO 22472 --- [ main] o.apache.catalina.core.StandardService : Stopping service [Tomcat] 2021-11-17 17:21:17.674 INFO 22472 --- [ main] ConditionEvaluationReportLoggingListener : Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled. 
2021-11-17 17:21:17.758 ERROR 22472 --- [ main] o.s.boot.SpringApplication : Application run failed org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'entityManagerFactory' defined in class path resource [org/springframework/boot/autoconfigure/orm/jpa/HibernateJpaConfiguration.class]: Invocation of init method failed; nested exception is org.hibernate.service.spi.ServiceException: Unable to create requested service [org.hibernate.engine.jdbc.env.spi.JdbcEnvironment] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1788) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:609) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:531) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1159) ~[spring-context-5.3.2.jar:5.3.2] at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:913) ~[spring-context-5.3.2.jar:5.3.2] at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:588) ~[spring-context-5.3.2.jar:5.3.2] at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:144) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:767) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:759) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:426) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.run(SpringApplication.java:326) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.run(SpringApplication.java:1309) ~[spring-boot-2.4.1.jar:2.4.1] at org.springframework.boot.SpringApplication.run(SpringApplication.java:1298) ~[spring-boot-2.4.1.jar:2.4.1] at jpabook.jpashop.JpashopApplication.main(JpashopApplication.java:10) ~[main/:na] Caused by: org.hibernate.service.spi.ServiceException: Unable to create requested service [org.hibernate.engine.jdbc.env.spi.JdbcEnvironment] at org.hibernate.service.internal.AbstractServiceRegistryImpl.createService(AbstractServiceRegistryImpl.java:275) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:237) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at 
org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:214) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.id.factory.internal.DefaultIdentifierGeneratorFactory.injectServices(DefaultIdentifierGeneratorFactory.java:152) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.injectDependencies(AbstractServiceRegistryImpl.java:286) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:243) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:214) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.boot.internal.InFlightMetadataCollectorImpl.<init>(InFlightMetadataCollectorImpl.java:176) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.boot.model.process.spi.MetadataBuildingProcess.complete(MetadataBuildingProcess.java:127) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.metadata(EntityManagerFactoryBuilderImpl.java:1224) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:1255) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.springframework.orm.jpa.vendor.SpringHibernateJpaPersistenceProvider.createContainerEntityManagerFactory(SpringHibernateJpaPersistenceProvider.java:58) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:365) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.buildNativeEntityManagerFactory(AbstractEntityManagerFactoryBean.java:409) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:396) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.afterPropertiesSet(LocalContainerEntityManagerFactoryBean.java:341) ~[spring-orm-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1847) ~[spring-beans-5.3.2.jar:5.3.2] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1784) ~[spring-beans-5.3.2.jar:5.3.2] ... 
17 common frames omitted Caused by: org.hibernate.HibernateException: Access to DialectResolutionInfo cannot be null when 'hibernate.dialect' not set at org.hibernate.engine.jdbc.dialect.internal.DialectFactoryImpl.determineDialect(DialectFactoryImpl.java:100) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.engine.jdbc.dialect.internal.DialectFactoryImpl.buildDialect(DialectFactoryImpl.java:54) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator.initiateService(JdbcEnvironmentInitiator.java:137) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator.initiateService(JdbcEnvironmentInitiator.java:35) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.boot.registry.internal.StandardServiceRegistryImpl.initiateService(StandardServiceRegistryImpl.java:101) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] at org.hibernate.service.internal.AbstractServiceRegistryImpl.createService(AbstractServiceRegistryImpl.java:263) ~[hibernate-core-5.4.25.Final.jar:5.4.25.Final] ... 34 common frames omitted > Task :JpashopApplication.main() FAILED 3 actionable tasks: 3 executed FAILURE: Build failed with an exception. * What went wrong: Execution failed for task ':JpashopApplication.main()'. > Process 'command 'C:/Program Files/Java/jdk-11.0.9/bin/java.exe'' finished with non-zero exit value 1 * Try: Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights. * Get more help at https://help.gradle.org BUILD FAILED in 33s 오후 5:21:18: Task execution finished 'JpashopApplication.main()'.
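For what it's worth, the root cause in the log above is the H2 connection failure (Connection is broken: "java.net.SocketTimeoutException: connect timed out: localhost"): the application could not reach an H2 database server on localhost, which typically means H2 was not started or the datasource URL does not match. Below is a sketch of a typical H2 server-mode datasource block; the URL and credentials are assumptions, not necessarily this project's settings, and the H2 server has to be launched separately before running the application.

# src/main/resources/application.yml -- illustrative values only
spring:
  datasource:
    url: jdbc:h2:tcp://localhost/~/jpashop   # assumes H2 running in server (TCP) mode with this DB file created
    username: sa
    password:
    driver-class-name: org.h2.Driver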
-
[Unresolved] 15-Day Big Data Pilot Project
Error running the Subject 5 workflow
When I run Subject 5 - Workflow, an error occurs at the third Hive query. Here is the log:

2021-11-17 17:08:05,939 INFO org.apache.oozie.service.JPAService: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[] No results found 2021-11-17 17:08:06,056 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@:start:] Start action [0000000-211117134747764-oozie-oozi-W@:start:] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10] 2021-11-17 17:08:06,089 INFO org.apache.oozie.action.control.StartActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@:start:] Starting action 2021-11-17 17:08:06,122 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@:start:] [***0000000-211117134747764-oozie-oozi-W@:start:***]Action status=DONE 2021-11-17 17:08:06,132 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@:start:] [***0000000-211117134747764-oozie-oozi-W@:start:***]Action updated in DB! 2021-11-17 17:08:07,005 INFO org.apache.oozie.action.control.StartActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@:start:] Action ended with external status [OK] 2021-11-17 17:08:07,619 INFO org.apache.oozie.service.JPAService: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@:start:] No results found 2021-11-17 17:08:07,684 INFO org.apache.oozie.command.wf.WorkflowNotificationXCommand: SERVER[server02.hadoop.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@:start:] No Notification URL is defined. Therefore nothing to notify for job 0000000-211117134747764-oozie-oozi-W@:start: 2021-11-17 17:08:07,688 INFO org.apache.oozie.command.wf.WorkflowNotificationXCommand: SERVER[server02.hadoop.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[] No Notification URL is defined. Therefore nothing to notify for job 0000000-211117134747764-oozie-oozi-W 2021-11-17 17:08:07,828 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] Start action [0000000-211117134747764-oozie-oozi-W@hive-537b] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10] 2021-11-17 17:08:07,848 INFO org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] Starting action. 
Getting Action File System 2021-11-17 17:08:07,931 INFO org.apache.oozie.service.HadoopAccessorService: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] Processing configuration file [/var/run/cloudera-scm-agent/process/274-oozie-OOZIE_SERVER/action-conf/default.xml] for action [default] and hostPort [*] 2021-11-17 17:08:07,966 INFO org.apache.oozie.service.HadoopAccessorService: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] Processing configuration file [/var/run/cloudera-scm-agent/process/274-oozie-OOZIE_SERVER/action-conf/hive2.xml] for action [hive2] and hostPort [*] 2021-11-17 17:08:16,679 WARN org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] Invalid configuration value [null] defined for launcher max attempts count, using default [2]. 2021-11-17 17:08:16,703 INFO org.apache.oozie.action.hadoop.YarnACLHandler: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] Not setting ACLs because mapreduce.cluster.acls.enabled is set to false 2021-11-17 17:08:20,280 INFO org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] checking action, hadoop job ID [application_1637124309648_0001] status [RUNNING] 2021-11-17 17:08:20,294 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] [***0000000-211117134747764-oozie-oozi-W@hive-537b***]Action status=RUNNING 2021-11-17 17:08:20,298 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] [***0000000-211117134747764-oozie-oozi-W@hive-537b***]Action updated in DB! 2021-11-17 17:08:20,312 INFO org.apache.oozie.command.wf.WorkflowNotificationXCommand: SERVER[server02.hadoop.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] No Notification URL is defined. 
Therefore nothing to notify for job 0000000-211117134747764-oozie-oozi-W@hive-537b 2021-11-17 17:08:41,461 INFO org.apache.oozie.servlet.CallbackServlet: SERVER[server02.hadoop.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] callback for action [0000000-211117134747764-oozie-oozi-W@hive-537b] 2021-11-17 17:08:41,990 INFO org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] action completed, external ID [application_1637124309648_0001] 2021-11-17 17:08:42,046 INFO org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] Action ended with external status [SUCCEEDED] 2021-11-17 17:08:42,469 INFO org.apache.oozie.service.JPAService: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] No results found 2021-11-17 17:08:42,566 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e14] Start action [0000000-211117134747764-oozie-oozi-W@hive-6e14] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10] 2021-11-17 17:08:42,592 INFO org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e14] Starting action. Getting Action File System 2021-11-17 17:08:47,110 WARN org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e14] Invalid configuration value [null] defined for launcher max attempts count, using default [2]. 
2021-11-17 17:08:47,113 INFO org.apache.oozie.action.hadoop.YarnACLHandler: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e14] Not setting ACLs because mapreduce.cluster.acls.enabled is set to false 2021-11-17 17:08:48,576 INFO org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e14] checking action, hadoop job ID [application_1637124309648_0002] status [RUNNING] 2021-11-17 17:08:48,585 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e14] [***0000000-211117134747764-oozie-oozi-W@hive-6e14***]Action status=RUNNING 2021-11-17 17:08:48,587 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e14] [***0000000-211117134747764-oozie-oozi-W@hive-6e14***]Action updated in DB! 2021-11-17 17:08:48,601 INFO org.apache.oozie.command.wf.WorkflowNotificationXCommand: SERVER[server02.hadoop.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e14] No Notification URL is defined. Therefore nothing to notify for job 0000000-211117134747764-oozie-oozi-W@hive-6e14 2021-11-17 17:08:48,603 INFO org.apache.oozie.command.wf.WorkflowNotificationXCommand: SERVER[server02.hadoop.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-537b] No Notification URL is defined. 
Therefore nothing to notify for job 0000000-211117134747764-oozie-oozi-W@hive-537b 2021-11-17 17:10:36,799 INFO org.apache.oozie.servlet.CallbackServlet: SERVER[server02.hadoop.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e14] callback for action [0000000-211117134747764-oozie-oozi-W@hive-6e14] 2021-11-17 17:10:37,091 INFO org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e14] External Child IDs : [job_1637124309648_0003] 2021-11-17 17:10:37,099 INFO org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e14] action completed, external ID [application_1637124309648_0002] 2021-11-17 17:10:37,175 INFO org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e14] Action ended with external status [SUCCEEDED] 2021-11-17 17:10:37,296 INFO org.apache.oozie.service.JPAService: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e14] No results found 2021-11-17 17:10:37,408 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] Start action [0000000-211117134747764-oozie-oozi-W@hive-6e91] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10] 2021-11-17 17:10:37,425 INFO org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] Starting action. Getting Action File System 2021-11-17 17:10:42,219 WARN org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] Invalid configuration value [null] defined for launcher max attempts count, using default [2]. 
2021-11-17 17:10:42,222 INFO org.apache.oozie.action.hadoop.YarnACLHandler: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] Not setting ACLs because mapreduce.cluster.acls.enabled is set to false 2021-11-17 17:10:44,054 INFO org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] checking action, hadoop job ID [application_1637124309648_0004] status [RUNNING] 2021-11-17 17:10:44,060 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] [***0000000-211117134747764-oozie-oozi-W@hive-6e91***]Action status=RUNNING 2021-11-17 17:10:44,060 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] [***0000000-211117134747764-oozie-oozi-W@hive-6e91***]Action updated in DB! 2021-11-17 17:10:44,071 INFO org.apache.oozie.command.wf.WorkflowNotificationXCommand: SERVER[server02.hadoop.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] No Notification URL is defined. Therefore nothing to notify for job 0000000-211117134747764-oozie-oozi-W@hive-6e91 2021-11-17 17:10:44,073 INFO org.apache.oozie.command.wf.WorkflowNotificationXCommand: SERVER[server02.hadoop.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e14] No Notification URL is defined. 
Therefore nothing to notify for job 0000000-211117134747764-oozie-oozi-W@hive-6e14 2021-11-17 17:10:58,958 INFO org.apache.oozie.servlet.CallbackServlet: SERVER[server02.hadoop.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] callback for action [0000000-211117134747764-oozie-oozi-W@hive-6e91] 2021-11-17 17:10:59,128 INFO org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] action completed, external ID [application_1637124309648_0004] 2021-11-17 17:10:59,132 WARN org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] Launcher ERROR, reason: Main Class [org.apache.oozie.action.hadoop.Hive2Main], exit code [2] 2021-11-17 17:10:59,174 INFO org.apache.oozie.action.hadoop.Hive2ActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] Action ended with external status [FAILED/KILLED] 2021-11-17 17:10:59,186 INFO org.apache.oozie.command.wf.ActionEndXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] ERROR is considered as FAILED for SLA 2021-11-17 17:10:59,279 INFO org.apache.oozie.service.JPAService: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] No results found 2021-11-17 17:10:59,329 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@Kill] Start action [0000000-211117134747764-oozie-oozi-W@Kill] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10] 2021-11-17 17:10:59,342 INFO org.apache.oozie.action.control.KillActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@Kill] Starting action 2021-11-17 17:10:59,752 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@Kill] [***0000000-211117134747764-oozie-oozi-W@Kill***]Action status=DONE 2021-11-17 17:10:59,754 INFO org.apache.oozie.command.wf.ActionStartXCommand: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@Kill] [***0000000-211117134747764-oozie-oozi-W@Kill***]Action updated in DB! 
2021-11-17 17:10:59,780 INFO org.apache.oozie.action.control.KillActionExecutor: SERVER[server02.hadoop.com] USER[admin] GROUP[-] TOKEN[] APP[Subject 5 - Workflow] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@Kill] Action ended with external status [OK]
2021-11-17 17:10:59,925 INFO org.apache.oozie.command.wf.WorkflowNotificationXCommand: SERVER[server02.hadoop.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@Kill] No Notification URL is defined. Therefore nothing to notify for job 0000000-211117134747764-oozie-oozi-W@Kill
2021-11-17 17:10:59,927 INFO org.apache.oozie.command.wf.WorkflowNotificationXCommand: SERVER[server02.hadoop.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[] No Notification URL is defined. Therefore nothing to notify for job 0000000-211117134747764-oozie-oozi-W
2021-11-17 17:10:59,927 INFO org.apache.oozie.command.wf.WorkflowNotificationXCommand: SERVER[server02.hadoop.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000000-211117134747764-oozie-oozi-W] ACTION[0000000-211117134747764-oozie-oozi-W@hive-6e91] No Notification URL is defined. Therefore nothing to notify for job 0000000-211117134747764-oozie-oozi-W@hive-6e91

Log Type: prelaunch.err
Log Upload Time: Wed Nov 17 17:11:00 +0900 2021
Log Length: 0

Log Type: prelaunch.out
Log Upload Time: Wed Nov 17 17:11:00 +0900 2021
Log Length: 70
Setting up env variables
Setting up job resources
Launching container

Log Type: stderr
Log Upload Time: Wed Nov 17 17:11:00 +0900 2021
Log Length: 4561
Showing 4096 bytes of 4561 total.

aticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console. Set system property 'org.apache.logging.log4j.simplelog.StatusLogger.level' to TRACE to show Log4j2 internal initialization logging.
Connecting to jdbc:hive2://server02.hadoop.com:10000/default
Connected to: Apache Hive (version 2.1.1-cdh6.3.2)
Driver: Hive JDBC (version 2.1.1-cdh6.3.2)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://server02.hadoop.com:10000/def> USE default;
INFO : Compiling command(queryId=hive_20211117171057_9409a0f0-f0c7-4196-a475-a9290b9eccdb): USE default
INFO : Semantic Analysis Completed
INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
INFO : Completed compiling command(queryId=hive_20211117171057_9409a0f0-f0c7-4196-a475-a9290b9eccdb); Time taken: 0.338 seconds
INFO : Executing command(queryId=hive_20211117171057_9409a0f0-f0c7-4196-a475-a9290b9eccdb): USE default
INFO : Starting task [Stage-0:DDL] in serial mode
INFO : Completed executing command(queryId=hive_20211117171057_9409a0f0-f0c7-4196-a475-a9290b9eccdb); Time taken: 0.016 seconds
INFO : OK
No rows affected (0.542 seconds)
0: jdbc:hive2://server02.hadoop.com:10000/def>
0: jdbc:hive2://server02.hadoop.com:10000/def> insert overwrite local directory '/home/pilot-pjt/item-buy-list'
. . . . . . . . . . . . . . . . . . . . . . .> ROW FORMAT DELIMITED
. . . . . . . . . . . . . . . . . . . . . . .> FIELDS TERMINATED BY ','
. . . . . . . . . . . . . . . . . . . . . . .> select car_number, concat_ws("," , collect_set(item))
. . . . . . . . . . . . . . . . . . . . . . .> from managed_smartcar_item_buylist_info
. . . . . . . . . . . . . . . . . . . . . . .> group by car_number
Error: Error while compiling statement: FAILED: RuntimeException Cannot create staging directory 'hdfs://server01.hadoop.com:8020/home/pilot-pjt/item-buy-list/.hive-staging_hive_2021-11-17_17-10-57_746_3560942864087896700-3': Permission denied: user=admin, access=WRITE, inode="/":hdfs:supergroup:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:400)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:256)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:194)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1855)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1839)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1798)
    at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:61)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3101)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1123)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:696)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)
(state=42000,code=40000)
Closing: 0: jdbc:hive2://server02.hadoop.com:10000/default

Log Type: stdout
Log Upload Time: Wed Nov 17 17:11:00 +0900 2021
Log Length: 198282
Showing 4096 bytes of 198282 total.
launch_container.sh
jetty-jndi-9.3.25.v20180904.jar
jersey-container-servlet-core-2.25.1.jar
datanucleus-core-4.1.6.jar
asm-tree-6.0.jar
------------------------ Script [hive-6e91.sql] content: ------------------------
USE default;
insert overwrite local directory '/home/pilot-pjt/item-buy-list'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
select car_number, concat_ws("," , collect_set(item))
from managed_smartcar_item_buylist_info
group by car_number
------------------------
Beeline command arguments : -u jdbc:hive2://server02.hadoop.com:10000/default -n admin -p DUMMY -d org.apache.hive.jdbc.HiveDriver -f hive-6e91.sql -a delegationToken --hiveconf mapreduce.job.tags=oozie-418ffadb1764d24a19ec7f0c02056574 --hiveconf oozie.action.id=0000000-211117134747764-oozie-oozi-W@hive-6e91 --hiveconf oozie.child.mapreduce.job.tags=oozie-418ffadb1764d24a19ec7f0c02056574 --hiveconf oozie.action.rootlogger.log.level=INFO --hiveconf oozie.job.id=0000000-211117134747764-oozie-oozi-W --hiveconf oozie.HadoopAccessorService.created=true
Fetching child yarn jobs
tag id : oozie-418ffadb1764d24a19ec7f0c02056574
No child applications found
=================================================================
>>> Invoking Beeline command line now >>>
<<< Invocation of Beeline command completed <<<
No child hadoop job is executed.
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:410)
    at org.apache.oozie.action.hadoop.LauncherAM.access$300(LauncherAM.java:55)
    at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:223)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:217)
    at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:153)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141)
Caused by: java.lang.SecurityException: Intercepted System.exit(2)
    at org.apache.oozie.action.hadoop.security.LauncherSecurityManager.checkExit(LauncherSecurityManager.java:57)
    at java.lang.Runtime.exit(Runtime.java:107)
    at java.lang.System.exit(System.java:971)
    at org.apache.oozie.action.hadoop.Hive2Main.runBeeline(Hive2Main.java:273)
    at org.apache.oozie.action.hadoop.Hive2Main.run(Hive2Main.java:250)
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:104)
    at org.apache.oozie.action.hadoop.Hive2Main.main(Hive2Main.java:65)
    ... 16 more
Intercepting System.exit(2)
Failing Oozie Launcher, Main Class [org.apache.oozie.action.hadoop.Hive2Main], exit code [2]
Oozie Launcher, uploading action data to HDFS sequence file: hdfs://server01.hadoop.com:8020/user/admin/oozie-oozi/0000000-211117134747764-oozie-oozi-W/hive-6e91--hive2/action-data.seq
Stopping AM
Callback notification attempts left 0
Callback notification trying http://server02.hadoop.com:11000/oozie/callback?id=0000000-211117134747764-oozie-oozi-W@hive-6e91&status=FAILED
Callback notification to http://server02.hadoop.com:11000/oozie/callback?id=0000000-211117134747764-oozie-oozi-W@hive-6e91&status=FAILED succeeded
Callback notification succeeded

That is the whole log. What could the problem be? ㅜㅜ I have already tried creating the directory and changing its permissions as described in another member's post, but it made no difference.
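For reference, the stderr section above already names the likely root cause: Hive tries to create a staging directory at hdfs://server01.hadoop.com:8020/home/pilot-pjt/item-buy-list/.hive-staging_... and the admin user has no WRITE permission on the HDFS root ("/"), because /home/pilot-pjt does not yet exist in HDFS and its ancestors would have to be created from "/". One plausible fix is to pre-create that HDFS path with superuser rights and hand it to admin. The sketch below does this with the Hadoop FileSystem Java API; the class name, the 775 permission, and the assumption that it runs with HDFS superuser (hdfs) credentials are mine, and the same effect can be achieved on the command line with hdfs dfs -mkdir -p, -chown, and -chmod run as the hdfs user.

// Hypothetical sketch: pre-create the HDFS path that Hive mirrors for its staging
// directory and give it to the 'admin' user so the Oozie Hive2 action can write there.
// Must run with HDFS superuser credentials; values below are taken from the error message.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class CreateItemBuyListDir {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://server01.hadoop.com:8020"); // NameNode shown in the error

        try (FileSystem fs = FileSystem.get(conf)) {
            Path target = new Path("/home/pilot-pjt/item-buy-list");
            fs.mkdirs(target);                                        // also creates the missing /home/pilot-pjt ancestors
            fs.setOwner(target, "admin", "admin");                    // the workflow runs as user 'admin'
            fs.setPermission(target, new FsPermission((short) 0775)); // rwxrwxr-x (assumed; loosen if needed)
        }
    }
}

Once /home/pilot-pjt/item-buy-list exists in HDFS and is owned by admin, the ancestor check on "/" that failed in the log should no longer be triggered when the workflow is rerun.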
-
Unresolved: 스프링 배치
Issue: application does not shut down when a step runs multi-threaded with a taskExecutor
Hello. When I configure a taskExecutor so the step runs with multiple threads, the application does not shut down after the job finishes. Is that because something inside the taskExecutor is still running? When no taskExecutor is configured, it terminates normally.
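For reference, a common cause (an assumption here, since the configuration is not shown) is that a ThreadPoolTaskExecutor keeps non-daemon worker threads alive, and nothing ever shuts the pool down, so the JVM cannot exit. Registering the executor as a Spring bean lets the ApplicationContext call shutdown() on it when the context closes, and marking its threads as daemon keeps them from blocking JVM exit on their own. A minimal sketch, with hypothetical class and bean names and illustrative pool sizes:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.TaskExecutor;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Configuration
public class StepExecutorConfig {

    // As a bean, Spring initializes the pool and calls shutdown() on context close.
    @Bean
    public TaskExecutor stepTaskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(4);                         // pool sizes are illustrative
        executor.setMaxPoolSize(4);
        executor.setThreadNamePrefix("step-thread-");
        executor.setDaemon(true);                            // daemon workers cannot keep the JVM alive
        executor.setWaitForTasksToCompleteOnShutdown(true);  // let in-flight chunks finish on shutdown
        return executor;
    }
}

If the executor is currently created inline with new ThreadPoolTaskExecutor() inside the step definition rather than as a bean, nothing ever shuts the pool down, which would match the symptom you describe.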
-
Unresolved: Klaytn 클레이튼 블록체인 어플리케이션 만들기 - NFT
I'd like to take this course.
Hello, I'm leaving a comment after receiving the email announcement~! I do intend to take the course!
-
Unresolved: 실전! 스프링 부트와 JPA 활용1 - 웹 애플리케이션 개발
"Could not find or load main class jpabook.jpashop.JpashopApplication" error
Everything worked fine until a day ago, and then this error suddenly appeared. Below is the full output when I run JpashopApplication:
"C:\Program Files\Java\jdk-11.0.9\bin\java.exe" "-javaagent:C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2020.3\lib\idea_rt.jar=56326:C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2020.3\bin" -Dfile.encoding=UTF-8 jpabook.jpashop.JpashopApplication
Error: Could not find or load main class jpabook.jpashop.JpashopApplication
Caused by: java.lang.ClassNotFoundException: jpabook.jpashop.JpashopApplication
Process finished with exit code 1
---------------------------------------------------
In a case like this, how should I go about debugging it?