Fix build under Windows when enabling BUILD_SHARED_LIBS #1100


Merged: 2 commits, Apr 22, 2023

Conversation

howard0su
Collaborator

Fixes the build error when BUILD_SHARED_LIBS is enabled. For now, all symbols are exported as a workaround.
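The "export all symbols" approach can be sketched in CMake via the `WINDOWS_EXPORT_ALL_SYMBOLS` machinery, which makes MSVC generate exports for every symbol in a DLL without requiring `__declspec(dllexport)` annotations. This is a minimal illustration of the technique, not the exact diff in this PR:

```cmake
# Sketch: when building shared libraries on MSVC, export every symbol
# automatically instead of annotating each one by hand.
if (BUILD_SHARED_LIBS)
    set(CMAKE_WINDOWS_EXPORT_ALL_SYMBOLS ON)
    set(CMAKE_POSITION_INDEPENDENT_CODE ON)
endif()
```

Note that MSVC caps auto-generated exports at 65535 symbols per DLL, which is one reason this is usually treated as a stopgap rather than a long-term design.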

@howard0su
Collaborator Author

The build errors:

quantize.obj : error LNK2019: unresolved external symbol ggml_time_init referenced in function main [C:\GPT\llama.cpp\build\examples\quantize\quantize.vcxproj]
quantize.obj : error LNK2019: unresolved external symbol ggml_time_us referenced in function main [C:\GPT\llama.cpp\build\examples\quantize\quantize.vcxproj]
quantize.obj : error LNK2019: unresolved external symbol ggml_init referenced in function main [C:\GPT\llama.cpp\build\examples\quantize\quantize.vcxproj]
quantize.obj : error LNK2019: unresolved external symbol ggml_free referenced in function main [C:\GPT\llama.cpp\build\examples\quantize\quantize.vcxproj]
C:\GPT\llama.cpp\build\bin\Debug\quantize.exe : fatal error LNK1120: 4 unresolved externals [C:\GPT\llama.cpp\build\examples\quantize\quantize.vcxproj]
quantize-stats.obj : error LNK2019: unresolved external symbol "class std::vector<struct std::pair<class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> >,struct ggml_tensor *>,class std::allocator<struct std::pair<class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> >,struct ggml_tensor *> > > & __cdecl llama_internal_get_tensor_map(struct llama_context *)" (?llama_internal_get_tensor_map@@YAAEAV?$vector@U?$pair@V?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@std@@PEAUggml_tensor@@@std@@V?$allocator@U?$pair@V?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@std@@PEAUggml_tensor@@@std@@@2@@std@@PEAUllama_context@@@Z) referenced in function main [C:\GPT\llama.cpp\build\examples\quantize-stats\quantize-stats.vcxproj]
C:\GPT\llama.cpp\build\bin\Debug\quantize-stats.exe : fatal error LNK1120: 1 unresolved externals [C:\GPT\llama.cpp\build\examples\quantize-stats\quantize-stats.vcxproj]

@howard0su howard0su marked this pull request as ready for review April 21, 2023 13:27
@ggerganov (Member) left a comment


This is a temporary solution until ggml starts exporting symbols explicitly (I keep postponing this change)

@ggerganov ggerganov merged commit 7e312f1 into ggml-org:master Apr 22, 2023
@howard0su
Collaborator Author

Yes, exactly. I could write a .def file, but llama_internal_get_tensor_map (used only by tests) blocks me.
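For context, a module-definition (.def) file lists exports by name, which works cleanly for the C symbols from the error log above but is awkward for a mangled C++ symbol like llama_internal_get_tensor_map. A minimal sketch (library name assumed for illustration):

```
; Hypothetical ggml.def listing the unresolved C symbols by hand.
LIBRARY ggml
EXPORTS
    ggml_time_init
    ggml_time_us
    ggml_init
    ggml_free
```

Exporting the C++ test-only function this way would require either its full decorated name or an extern "C" wrapper, which is the blocker described above.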
