
digitalmars.D.learn - DerelictGL program draws nothing

reply "Zhenya" <zheny list.ru> writes:
Why doesn't this simple program show a white square?

import std.stdio;

import derelict.opengl3.gl;
import derelict.glfw3.glfw3;

const uint width = 200;
const uint height = 200;

void init()
{
	glViewport(0,0,width,height);
	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	glOrtho(-width,width,-height,height,-1,1);
	glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
}

void display()
{
	glClear(GL_COLOR_BUFFER_BIT);
	glBegin(GL_POLYGON);
		glVertex2d(0,0);
		glVertex2d(0,height);
		glVertex2d(width,height);
		glVertex2d(height,0);
	glEnd();
}

void main()
{
	DerelictGL.load();
	DerelictGLFW3.load();
	glfwInit();
	GLFWwindow window;
	window = glfwCreateWindow(width,height,GLFW_WINDOWED,"Hello DerelictGLFW3",null);
	init();
	bool opened = true;
	while(opened)
	{
		opened = !glfwGetWindowParam(window,GLFW_CLOSE_REQUESTED) && !glfwGetKey(window,GLFW_KEY_ESC);
		display();
		glfwSwapBuffers(window);
		glfwWaitEvents();
	}
	glfwTerminate();
}
Sep 03 2012
next sibling parent reply "cal" <callumenator gmail.com> writes:
On Monday, 3 September 2012 at 15:21:59 UTC, Zhenya wrote:
 Why doesn't this simple program show a white square?
 void display()
 {
 	glClear(GL_COLOR_BUFFER_BIT);
 	glBegin(GL_POLYGON);
 		glVertex2d(0,0);
 		glVertex2d(0,height);
 		glVertex2d(width,height);
 		glVertex2d(height,0);        <<<<<<<< glVertex2d(width, 0)
 	glEnd();
 }
If that doesn't work, maybe you need to wind the vertices the other way to avoid backface culling, ie:

glVertex2d(0, 0)
glVertex2d(width, 0)
glVertex2d(width, height)
glVertex2d(0, height)
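A quick way to rule out backface culling entirely (an aside using standard OpenGL calls, not something mentioned in this thread) is to switch culling off before drawing:

// If the quad shows up with culling disabled, the winding order was the problem.
glDisable(GL_CULL_FACE);
display();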
Sep 03 2012
parent reply "Zhenya" <zheny list.ru> writes:
On Monday, 3 September 2012 at 16:57:08 UTC, cal wrote:
 On Monday, 3 September 2012 at 15:21:59 UTC, Zhenya wrote:
 Why doesn't this simple program show a white square?
 void display()
 {
 	glClear(GL_COLOR_BUFFER_BIT);
 	glBegin(GL_POLYGON);
 		glVertex2d(0,0);
 		glVertex2d(0,height);
 		glVertex2d(width,height);
 		glVertex2d(height,0);        <<<<<<<< glVertex2d(width, 0)
 	glEnd();
 }
 If that doesn't work, maybe you need to wind the vertices the other way to avoid backface culling, ie:

 glVertex2d(0, 0)
 glVertex2d(width, 0)
 glVertex2d(width, height)
 glVertex2d(0, height)
That doesn't work.
Sep 03 2012
parent reply "cal" <callumenator gmail.com> writes:
On Monday, 3 September 2012 at 17:02:46 UTC, Zhenya wrote:
 That doesn't work.
How large is your window? glViewport(0,0,width,height); should really be set to the window size, so if you make your window 800x800, this should be glViewport(0,0,800,800);
Sep 03 2012
parent reply "cal" <callumenator gmail.com> writes:
On Monday, 3 September 2012 at 17:08:55 UTC, cal wrote:
 On Monday, 3 September 2012 at 17:02:46 UTC, Zhenya wrote:
 That doesn't work.
 How large is your window? glViewport(0,0,width,height); should really be set to the window size, so if you make your window 800x800, this should be glViewport(0,0,800,800);
Just saw it, never mind. Also, I think before you draw you need to set the matrix mode to modelview and load the identity:

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
display();
Sep 03 2012
parent reply "Zhenya" <zheny list.ru> writes:
On Monday, 3 September 2012 at 17:12:04 UTC, cal wrote:
 On Monday, 3 September 2012 at 17:08:55 UTC, cal wrote:
 On Monday, 3 September 2012 at 17:02:46 UTC, Zhenya wrote:
 That doesn't work.
 How large is your window? glViewport(0,0,width,height); should really be set to the window size, so if you make your window 800x800, this should be glViewport(0,0,800,800);
 Just saw it, never mind. Also, I think before you draw you need to set the matrix mode to modelview and load the identity:

 glMatrixMode(GL_MODELVIEW);
 glLoadIdentity();
 display();
const uint width = 200;
const uint height = 200;
 glMatrixMode(GL_MODELVIEW);
 glLoadIdentity();
 display();
I added it to this code, but nothing changed :(
Sep 03 2012
parent "cal" <callumenator gmail.com> writes:
On Monday, 3 September 2012 at 17:16:54 UTC, Zhenya wrote:
 I added it to this code, but nothing changed :(
It's a puzzler then. FWIW the following code works for me (I don't use GLFW, I have my own window routines, but the OpenGL-specific calls are the same):

Window("window1", WindowState(0,0,200,200), Flag!"Create".yes, Flag!"Show".yes);
Window.makeCurrent("window1");

glViewport(0,0,200,200);
glClearColor(0,0,0,1);

bool finish = false;
while (!finish)
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(-200, 200, -200, 200, -1, 1);

    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_POLYGON);
        glVertex2d(0, 0);
        glVertex2d(0, 200);
        glVertex2d(200, 200);
        glVertex2d(200, 0);
    glEnd();

    Window().swapBuffers();

    if (Window().keyState().keys[KEY.KC_ESCAPE])
        finish = true;
}

Perhaps something in the GLFW init is modifying some OpenGL defaults; you might like to ask on their forum.
Sep 03 2012
prev sibling parent reply "Ivan Agafonov" <armadil yandex.ru> writes:
On Monday, 3 September 2012 at 15:21:59 UTC, Zhenya wrote:
 Why doesn't this simple program show a white square?

 import std.stdio;

 import derelict.opengl3.gl;
 import derelict.glfw3.glfw3;

 const uint width = 200;
 const uint height = 200;

 void init()
 {
 	glViewport(0,0,width,height);
 	glMatrixMode(GL_PROJECTION);
 	glLoadIdentity();
 	glOrtho(-width,width,-height,height,-1,1);
 	glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
 }

 void display()
 {
 	glClear(GL_COLOR_BUFFER_BIT);
 	glBegin(GL_POLYGON);
 		glVertex2d(0,0);
 		glVertex2d(0,height);
 		glVertex2d(width,height);
 		glVertex2d(height,0);
 	glEnd();
 }

 void main()
 {
 	DerelictGL.load();
 	DerelictGLFW3.load();
 	glfwInit();
 	GLFWwindow window;
 	window = glfwCreateWindow(width,height,GLFW_WINDOWED,"Hello DerelictGLFW3",null);
 	init();
 	bool opened = true;
 	while(opened)
 	{
 		opened = !glfwGetWindowParam(window,GLFW_CLOSE_REQUESTED) && !glfwGetKey(window,GLFW_KEY_ESC);
 		display();
 		glfwSwapBuffers(window);
 		glfwWaitEvents();
 	}
 	glfwTerminate();
 }
width and height must be int, not uint.

After window = glfwCreateWindow(...); put:

glfwMakeContextCurrent(window);

And it will work!
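For reference, a minimal sketch of main() with both changes applied. It reuses init() and display() from the original post and assumes the same pre-release GLFW3 binding (GLFW_WINDOWED, GLFW_CLOSE_REQUESTED, glfwGetWindowParam), so these names may differ in newer versions:

import derelict.opengl3.gl;
import derelict.glfw3.glfw3;

const int width  = 200;   // int instead of uint, so -width stays negative in glOrtho
const int height = 200;

void main()
{
    DerelictGL.load();
    DerelictGLFW3.load();
    glfwInit();

    auto window = glfwCreateWindow(width, height, GLFW_WINDOWED,
                                   "Hello DerelictGLFW3", null);
    glfwMakeContextCurrent(window);   // the missing call: make the GL context current

    init();
    bool opened = true;
    while (opened)
    {
        opened = !glfwGetWindowParam(window, GLFW_CLOSE_REQUESTED)
                 && !glfwGetKey(window, GLFW_KEY_ESC);
        display();
        glfwSwapBuffers(window);
        glfwWaitEvents();
    }
    glfwTerminate();
}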
Sep 03 2012
parent reply "Zhenya" <zheny list.ru> writes:
On Monday, 3 September 2012 at 18:47:25 UTC, Ivan Agafonov wrote:
 On Monday, 3 September 2012 at 15:21:59 UTC, Zhenya wrote:
 Why doesn't this simple program show a white square?

 import std.stdio;

 import derelict.opengl3.gl;
 import derelict.glfw3.glfw3;

 const uint width = 200;
 const uint height = 200;

 void init()
 {
 	glViewport(0,0,width,height);
 	glMatrixMode(GL_PROJECTION);
 	glLoadIdentity();
 	glOrtho(-width,width,-height,height,-1,1);
 	glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
 }

 void display()
 {
 	glClear(GL_COLOR_BUFFER_BIT);
 	glBegin(GL_POLYGON);
 		glVertex2d(0,0);
 		glVertex2d(0,height);
 		glVertex2d(width,height);
 		glVertex2d(height,0);
 	glEnd();
 }

 void main()
 {
 	DerelictGL.load();
 	DerelictGLFW3.load();
 	glfwInit();
 	GLFWwindow window;
 	window = glfwCreateWindow(width,height,GLFW_WINDOWED,"Hello DerelictGLFW3",null);
 	init();
 	bool opened = true;
 	while(opened)
 	{
 		opened = !glfwGetWindowParam(window,GLFW_CLOSE_REQUESTED) && !glfwGetKey(window,GLFW_KEY_ESC);
 		display();
 		glfwSwapBuffers(window);
 		glfwWaitEvents();
 	}
 	glfwTerminate();
 }
 width and height must be int, not uint.

 After window = glfwCreateWindow(...); put:

 glfwMakeContextCurrent(window);

 And it will work!
Thank you very much)
Sep 03 2012
parent reply "Zhenya" <zheny list.ru> writes:
But why doesn't it convert uint to int correctly?
Sep 04 2012
parent reply "Ivan Agafonov" <armadil yandex.ru> writes:
On Tuesday, 4 September 2012 at 07:32:47 UTC, Zhenya wrote:
 But why doesn't it convert uint to int correctly?
I don't know; a small positive uint and an int have the same binary representation, so no conversion should be needed. Strange... Oh! Maybe the problem is here:

glOrtho(-width,width,-height,height,-1,1);

-width and -height: negative unsigned?
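That is indeed the problem: in D, negating a uint wraps around instead of going negative, so glOrtho receives a huge positive value for the left and bottom planes. A tiny sketch illustrating the wraparound (plain std.stdio, nothing from the thread needed):

import std.stdio;

void main()
{
    const uint width = 200;
    writeln(-width);                // 4294967096, not -200: uint negation wraps around
    writeln(cast(double) -width);   // a huge positive double - what glOrtho actually receives
    const int w = 200;
    writeln(-w);                    // -200, as intended
}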
Sep 04 2012
parent "Zhenya" <zheny list.ru> writes:
On Tuesday, 4 September 2012 at 16:17:30 UTC, Ivan Agafonov wrote:
 On Tuesday, 4 September 2012 at 07:32:47 UTC, Zhenya wrote:
 But why doesn't it convert uint to int correctly?
 I don't know; a small positive uint and an int have the same binary representation, so no conversion should be needed. Strange... Oh! Maybe the problem is here:

 glOrtho(-width,width,-height,height,-1,1);

 -width and -height: negative unsigned?
:) Understood.
Sep 04 2012